00:00:00.001 Started by upstream project "autotest-per-patch" build number 124199
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.033 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.033 The recommended git tool is: git
00:00:00.033 using credential 00000000-0000-0000-0000-000000000002
00:00:00.035 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.056 Fetching changes from the remote Git repository
00:00:00.058 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.112 Using shallow fetch with depth 1
00:00:00.112 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.112 > git --version # timeout=10
00:00:00.177 > git --version # 'git version 2.39.2'
00:00:00.177 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.265 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.265 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:04.253 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:04.265 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:04.278 Checking out Revision 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 (FETCH_HEAD)
00:00:04.278 > git config core.sparsecheckout # timeout=10
00:00:04.290 > git read-tree -mu HEAD # timeout=10
00:00:04.315 > git checkout -f 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=5
00:00:04.335 Commit message: "pool: fixes for VisualBuild class"
00:00:04.335 > git rev-list --no-walk 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=10
00:00:04.419 [Pipeline] Start of Pipeline
00:00:04.435 [Pipeline] library
00:00:04.436 Loading library shm_lib@master
00:00:04.436 Library shm_lib@master is cached. Copying from home.
00:00:04.455 [Pipeline] node
00:00:19.457 Still waiting to schedule task
00:00:19.458 ‘FCP03’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP04’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP07’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP08’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP09’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP10’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP11’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘FCP12’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘GP10’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘GP11’ is offline
00:00:19.458 ‘GP12’ is offline
00:00:19.458 ‘GP13’ is offline
00:00:19.458 ‘GP14’ is offline
00:00:19.458 ‘GP15’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘GP16’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.458 ‘GP18’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP19’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP1’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP20’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP21’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP22’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP24’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP2’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP3’ is offline
00:00:19.459 ‘GP4’ is offline
00:00:19.459 ‘GP5’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘GP6’ is offline
00:00:19.459 ‘GP8’ is offline
00:00:19.459 ‘GP9’ is offline
00:00:19.459 ‘ImageBuilder1’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘Jenkins’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘ME1’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘ME2’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘ME3’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.459 ‘PE5’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM10’ is offline
00:00:19.460 ‘SM11’ is offline
00:00:19.460 ‘SM1’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM28’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM29’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM2’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM30’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM31’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM32’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM33’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM34’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM35’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM6’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM7’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘SM8’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘VM-host-WFP25’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘WCP0’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.460 ‘WCP2’ is offline
00:00:19.460 ‘WCP4’ is offline
00:00:19.461 ‘WFP13’ is offline
00:00:19.461 ‘WFP17’ is offline
00:00:19.461 ‘WFP21’ is offline
00:00:19.461 ‘WFP23’ is offline
00:00:19.461 ‘WFP29’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP2’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP32’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP33’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP34’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP35’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP36’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.461 ‘WFP37’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.462 ‘WFP38’ is offline
00:00:19.462 ‘WFP41’ is offline
00:00:19.462 ‘WFP43’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.462 ‘WFP49’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.462 ‘WFP50’ is offline
00:00:19.462 ‘WFP63’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.462 ‘WFP65’ is offline
00:00:19.462 ‘WFP66’ is offline
00:00:19.463 ‘WFP67’ is offline
00:00:19.463 ‘WFP68’ is offline
00:00:19.463 ‘WFP69’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.463 ‘WFP6’ is offline
00:00:19.463 ‘WFP8’ is offline
00:00:19.463 ‘WFP9’ is offline
00:00:19.463 ‘prc_bsc_waikikibeach64’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.463 ‘spdk-pxe-01’ doesn’t have label ‘DiskNvme&&DevQAT’
00:00:19.463 ‘spdk-pxe-02’ doesn’t have label ‘DiskNvme&&DevQAT’
00:19:12.797 Running on WFP52 in /var/jenkins/workspace/crypto-phy-autotest
00:19:12.799 [Pipeline] {
00:19:12.808 [Pipeline] catchError
00:19:12.810 [Pipeline] {
00:19:12.821 [Pipeline] wrap
00:19:12.827 [Pipeline] {
00:19:12.835 [Pipeline] stage
00:19:12.836 [Pipeline] { (Prologue)
00:19:12.998 [Pipeline] sh
00:19:13.275 + logger -p user.info -t JENKINS-CI
00:19:13.293 [Pipeline] echo
00:19:13.295 Node: WFP52
00:19:13.303 [Pipeline] sh
00:19:13.598 [Pipeline] setCustomBuildProperty
00:19:13.611 [Pipeline] echo
00:19:13.613 Cleanup processes
00:19:13.621 [Pipeline] sh
00:19:13.903 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:19:13.903 4175905 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:19:13.915 [Pipeline] sh
00:19:14.195 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:19:14.196 ++ grep -v 'sudo pgrep'
00:19:14.196 ++ awk '{print $1}'
00:19:14.196 + sudo kill -9
00:19:14.196 + true
00:19:14.209 [Pipeline] cleanWs
00:19:14.218 [WS-CLEANUP] Deleting project workspace...
00:19:14.218 [WS-CLEANUP] Deferred wipeout is used...
00:19:14.224 [WS-CLEANUP] done
00:19:14.229 [Pipeline] setCustomBuildProperty
00:19:14.244 [Pipeline] sh
00:19:14.523 + sudo git config --global --replace-all safe.directory '*'
00:19:14.596 [Pipeline] nodesByLabel
00:19:14.598 Found a total of 2 nodes with the 'sorcerer' label
00:19:14.610 [Pipeline] httpRequest
00:19:14.615 HttpMethod: GET
00:19:14.616 URL: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:19:14.617 Sending request to url: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:19:14.621 Response Code: HTTP/1.1 200 OK
00:19:14.622 Success: Status code 200 is in the accepted range: 200,404
00:19:14.622 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:19:14.765 [Pipeline] sh
00:19:15.047 + tar --no-same-owner -xf jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:19:15.065 [Pipeline] httpRequest
00:19:15.070 HttpMethod: GET
00:19:15.071 URL: http://10.211.164.101/packages/spdk_a3f6419f1f5db09443dad17e3dc59785ff2e1617.tar.gz
00:19:15.071 Sending request to url: http://10.211.164.101/packages/spdk_a3f6419f1f5db09443dad17e3dc59785ff2e1617.tar.gz
00:19:15.073 Response Code: HTTP/1.1 200 OK
00:19:15.074 Success: Status code 200 is in the accepted range: 200,404
00:19:15.074 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_a3f6419f1f5db09443dad17e3dc59785ff2e1617.tar.gz
00:19:17.241 [Pipeline] sh
00:19:17.527 + tar --no-same-owner -xf spdk_a3f6419f1f5db09443dad17e3dc59785ff2e1617.tar.gz
00:19:20.076 [Pipeline] sh
00:19:20.359 + git -C spdk log --oneline -n5
00:19:20.359 a3f6419f1 app/nvme_identify: Add NVM Identify Namespace Data for ELBA Format
00:19:20.359 3b7525570 nvme: Get PI format for Extended LBA format
00:19:20.359 1e8a0c991 nvme: Get NVM Identify Namespace Data for Extended LBA Format
00:19:20.359 493b11851 nvme: Use Host Behavior Support Feature to enable LBA Format Extension
00:19:20.359 e2612f201 nvme: Factor out getting ZNS Identify Namespace Data
00:19:20.372 [Pipeline] }
00:19:20.389 [Pipeline] // stage
00:19:20.399 [Pipeline] stage
00:19:20.401 [Pipeline] { (Prepare)
00:19:20.420 [Pipeline] writeFile
00:19:20.463 [Pipeline] sh
00:19:20.745 + logger -p user.info -t JENKINS-CI
00:19:20.759 [Pipeline] sh
00:19:21.042 + logger -p user.info -t JENKINS-CI
00:19:21.055 [Pipeline] sh
00:19:21.336 + cat autorun-spdk.conf
00:19:21.336 SPDK_RUN_FUNCTIONAL_TEST=1
00:19:21.336 SPDK_TEST_BLOCKDEV=1
00:19:21.336 SPDK_TEST_ISAL=1
00:19:21.336 SPDK_TEST_CRYPTO=1
00:19:21.336 SPDK_TEST_REDUCE=1
00:19:21.336 SPDK_TEST_VBDEV_COMPRESS=1
00:19:21.336 SPDK_RUN_UBSAN=1
00:19:21.343 RUN_NIGHTLY=0
00:19:21.348 [Pipeline] readFile
00:19:21.380 [Pipeline] withEnv
00:19:21.382 [Pipeline] {
00:19:21.396 [Pipeline] sh
00:19:21.678 + set -ex
00:19:21.678 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:19:21.678 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:19:21.678 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:19:21.678 ++ SPDK_TEST_BLOCKDEV=1
00:19:21.678 ++ SPDK_TEST_ISAL=1
00:19:21.678 ++ SPDK_TEST_CRYPTO=1
00:19:21.678 ++ SPDK_TEST_REDUCE=1
00:19:21.678 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:19:21.678 ++ SPDK_RUN_UBSAN=1
00:19:21.678 ++ RUN_NIGHTLY=0
00:19:21.678 + case $SPDK_TEST_NVMF_NICS in
00:19:21.678 + DRIVERS=
00:19:21.678 + [[ -n '' ]]
00:19:21.678 + exit 0
00:19:21.688 [Pipeline] }
00:19:21.709 [Pipeline] // withEnv
00:19:21.715 [Pipeline] }
00:19:21.732 [Pipeline] // stage
00:19:21.743 [Pipeline] catchError
00:19:21.746 [Pipeline] {
00:19:21.764 [Pipeline] timeout
00:19:21.764 Timeout set to expire in 40 min
00:19:21.766 [Pipeline] {
00:19:21.780 [Pipeline] stage
00:19:21.781 [Pipeline] { (Tests)
00:19:21.796 [Pipeline] sh
00:19:22.079 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:19:22.079 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:19:22.079 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:19:22.079 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:19:22.079 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:19:22.079 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:19:22.079 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:19:22.079 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:19:22.079 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:19:22.079 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:19:22.079 + [[ crypto-phy-autotest == pkgdep-* ]]
00:19:22.079 + cd /var/jenkins/workspace/crypto-phy-autotest
00:19:22.079 + source /etc/os-release
00:19:22.079 ++ NAME='Fedora Linux'
00:19:22.079 ++ VERSION='38 (Cloud Edition)'
00:19:22.079 ++ ID=fedora
00:19:22.079 ++ VERSION_ID=38
00:19:22.079 ++ VERSION_CODENAME=
00:19:22.079 ++ PLATFORM_ID=platform:f38
00:19:22.079 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:19:22.079 ++ ANSI_COLOR='0;38;2;60;110;180'
00:19:22.079 ++ LOGO=fedora-logo-icon
00:19:22.079 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:19:22.079 ++ HOME_URL=https://fedoraproject.org/
00:19:22.079 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:19:22.079 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:19:22.079 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:19:22.079 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:19:22.079 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:19:22.079 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:19:22.079 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:19:22.079 ++ SUPPORT_END=2024-05-14
00:19:22.079 ++ VARIANT='Cloud Edition'
00:19:22.079 ++ VARIANT_ID=cloud
00:19:22.079 + uname -a
00:19:22.079 Linux spdk-wfp-52 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:19:22.079 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:19:25.366 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:19:25.366 Hugepages
00:19:25.366 node hugesize free / total
00:19:25.366 node0 1048576kB 0 / 0
00:19:25.366 node0 2048kB 0 / 0
00:19:25.366 node1 1048576kB 0 / 0
00:19:25.366 node1 2048kB 0 / 0
00:19:25.366 
00:19:25.366 Type BDF Vendor Device NUMA Driver Device Block devices
00:19:25.366 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:19:25.366 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:19:25.366 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:19:25.366 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:19:25.366 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:19:25.366 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:19:25.366 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:19:25.367 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:19:25.367 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:19:25.367 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:19:25.367 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:19:25.367 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:19:25.367 + rm -f /tmp/spdk-ld-path
00:19:25.367 + source autorun-spdk.conf
00:19:25.367 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:19:25.367 ++ SPDK_TEST_BLOCKDEV=1
00:19:25.367 ++ SPDK_TEST_ISAL=1
00:19:25.367 ++ SPDK_TEST_CRYPTO=1
00:19:25.367 ++ SPDK_TEST_REDUCE=1
00:19:25.367 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:19:25.367 ++ SPDK_RUN_UBSAN=1
00:19:25.367 ++ RUN_NIGHTLY=0
00:19:25.367 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:19:25.367 + [[ -n '' ]]
00:19:25.367 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:19:25.367 + for M in /var/spdk/build-*-manifest.txt
00:19:25.367 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:19:25.367 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:19:25.367 + for M in /var/spdk/build-*-manifest.txt
00:19:25.367 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:19:25.367 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:19:25.367 ++ uname
00:19:25.367 + [[ Linux == \L\i\n\u\x ]]
00:19:25.367 + sudo dmesg -T
00:19:25.367 + sudo dmesg --clear
00:19:25.626 + dmesg_pid=4177385
00:19:25.626 + [[ Fedora Linux == FreeBSD ]]
00:19:25.626 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:19:25.626 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:19:25.626 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:19:25.626 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2
00:19:25.626 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2
00:19:25.626 + [[ -x /usr/src/fio-static/fio ]]
00:19:25.626 + sudo dmesg -Tw
00:19:25.626 + export FIO_BIN=/usr/src/fio-static/fio
00:19:25.626 + FIO_BIN=/usr/src/fio-static/fio
00:19:25.626 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:19:25.626 + [[ ! -v VFIO_QEMU_BIN ]]
00:19:25.626 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:19:25.626 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:19:25.626 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:19:25.626 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:19:25.626 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:19:25.626 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:19:25.626 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:19:25.626 Test configuration:
00:19:25.626 SPDK_RUN_FUNCTIONAL_TEST=1
00:19:25.626 SPDK_TEST_BLOCKDEV=1
00:19:25.626 SPDK_TEST_ISAL=1
00:19:25.626 SPDK_TEST_CRYPTO=1
00:19:25.626 SPDK_TEST_REDUCE=1
00:19:25.626 SPDK_TEST_VBDEV_COMPRESS=1
00:19:25.626 SPDK_RUN_UBSAN=1
00:19:25.626 RUN_NIGHTLY=0
11:30:09 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:19:25.626 11:30:09 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:19:25.626 11:30:09 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:19:25.626 11:30:09 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:19:25.626 11:30:09 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:19:25.626 11:30:09 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:19:25.626 11:30:09 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:19:25.626 11:30:09 -- paths/export.sh@5 -- $ export PATH
00:19:25.626 11:30:09 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:19:25.626 11:30:09 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:19:25.626 11:30:09 -- common/autobuild_common.sh@437 -- $ date +%s
00:19:25.626 11:30:09 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718011809.XXXXXX
00:19:25.626 11:30:09 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718011809.z0yxVV
00:19:25.626 11:30:09 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]]
00:19:25.626 11:30:09 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']'
00:19:25.626 11:30:09 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
11:30:09 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:19:25.626 11:30:09 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:19:25.626 11:30:09 -- common/autobuild_common.sh@453 -- $ get_config_params
00:19:25.626 11:30:09 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:19:25.626 11:30:09 -- common/autotest_common.sh@10 -- $ set +x
00:19:25.626 11:30:09 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:19:25.626 11:30:09 -- common/autobuild_common.sh@455 -- $ start_monitor_resources
00:19:25.626 11:30:09 -- pm/common@17 -- $ local monitor
00:19:25.626 11:30:09 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:19:25.626 11:30:09 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:19:25.626 11:30:09 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:19:25.626 11:30:09 -- pm/common@21 -- $ date +%s
00:19:25.626 11:30:09 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:19:25.626 11:30:09 -- pm/common@21 -- $ date +%s
00:19:25.626 11:30:09 -- pm/common@21 -- $ date +%s
00:19:25.626 11:30:09 -- pm/common@25 -- $ sleep 1
00:19:25.626 11:30:09 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718011809
00:19:25.626 11:30:09 -- pm/common@21 -- $ date +%s
00:19:25.626 11:30:09 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718011809
00:19:25.626 11:30:09 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718011809
00:19:25.626 11:30:09 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718011809
00:19:25.626 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718011809_collect-cpu-load.pm.log
00:19:25.626 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718011809_collect-vmstat.pm.log
00:19:25.626 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718011809_collect-cpu-temp.pm.log
00:19:25.884 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718011809_collect-bmc-pm.bmc.pm.log
00:19:26.821 11:30:10 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT
00:19:26.821 11:30:10 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:19:26.821 11:30:10 -- spdk/autobuild.sh@12 -- $ umask 022
00:19:26.821 11:30:10 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:19:26.821 11:30:10 -- spdk/autobuild.sh@16 -- $ date -u
00:19:26.821 Mon Jun 10 09:30:10 AM UTC 2024
00:19:26.821 11:30:10 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:19:26.821 v24.09-pre-59-ga3f6419f1
00:19:26.821 11:30:10 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:19:26.821 11:30:10 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:19:26.821 11:30:10 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:19:26.821 11:30:10 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']'
00:19:26.821 11:30:10 -- common/autotest_common.sh@1106 -- $ xtrace_disable
00:19:26.821 11:30:10 -- common/autotest_common.sh@10 -- $ set +x
00:19:26.821 ************************************
00:19:26.821 START TEST ubsan
00:19:26.821 ************************************
00:19:26.821 11:30:10 ubsan -- common/autotest_common.sh@1124 -- $ echo 'using ubsan'
00:19:26.821 using ubsan
00:19:26.821 
00:19:26.821 real 0m0.001s
00:19:26.821 user 0m0.000s
00:19:26.821 sys 0m0.001s
00:19:26.821 11:30:10 ubsan -- common/autotest_common.sh@1125 -- $ xtrace_disable
00:19:26.821 11:30:10 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:19:26.821 ************************************
00:19:26.821 END TEST ubsan
00:19:26.821 ************************************
00:19:26.821 11:30:10 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:19:26.821 11:30:10 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:19:26.821 11:30:10 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:19:26.821 11:30:10 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:19:26.821 11:30:10 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:19:26.821 11:30:10 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:19:26.821 11:30:10 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:19:26.821 11:30:10 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:19:26.821 11:30:10 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:19:26.821 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:19:26.821 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:19:27.388 Using 'verbs' RDMA provider
00:19:40.544 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:19:55.429 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:19:55.429 Creating mk/config.mk...done.
00:19:55.429 Creating mk/cc.flags.mk...done.
00:19:55.429 Type 'make' to build.
00:19:55.429 11:30:37 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:19:55.429 11:30:37 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']'
00:19:55.429 11:30:37 -- common/autotest_common.sh@1106 -- $ xtrace_disable
00:19:55.429 11:30:37 -- common/autotest_common.sh@10 -- $ set +x
00:19:55.429 ************************************
00:19:55.429 START TEST make
00:19:55.429 ************************************
00:19:55.429 11:30:37 make -- common/autotest_common.sh@1124 -- $ make -j72
00:19:55.429 make[1]: Nothing to be done for 'all'.
00:20:27.516 The Meson build system
00:20:27.516 Version: 1.3.1
00:20:27.516 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:20:27.517 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:20:27.517 Build type: native build
00:20:27.517 Program cat found: YES (/usr/bin/cat)
00:20:27.517 Project name: DPDK
00:20:27.517 Project version: 24.03.0
00:20:27.517 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:20:27.517 C linker for the host machine: cc ld.bfd 2.39-16
00:20:27.517 Host machine cpu family: x86_64
00:20:27.517 Host machine cpu: x86_64
00:20:27.517 Message: ## Building in Developer Mode ##
00:20:27.517 Program pkg-config found: YES (/usr/bin/pkg-config)
00:20:27.517 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:20:27.517 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:20:27.517 Program python3 found: YES (/usr/bin/python3)
00:20:27.517 Program cat found: YES (/usr/bin/cat)
00:20:27.517 Compiler for C supports arguments -march=native: YES
00:20:27.517 Checking for size of "void *" : 8
00:20:27.517 Checking for size of "void *" : 8 (cached)
00:20:27.517 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:20:27.517 Library m found: YES
00:20:27.517 Library numa found: YES
00:20:27.517 Has header "numaif.h" : YES
00:20:27.517 Library fdt found: NO
00:20:27.517 Library execinfo found: NO
00:20:27.517 Has header "execinfo.h" : YES
00:20:27.517 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:20:27.517 Run-time dependency libarchive found: NO (tried pkgconfig)
00:20:27.517 Run-time dependency libbsd found: NO (tried pkgconfig)
00:20:27.517 Run-time dependency jansson found: NO (tried pkgconfig)
00:20:27.517 Run-time dependency openssl found: YES 3.0.9
00:20:27.517 Run-time dependency libpcap found: YES 1.10.4
00:20:27.517 Has header "pcap.h" with dependency libpcap: YES
00:20:27.517 Compiler for C supports arguments -Wcast-qual: YES
00:20:27.517 Compiler for C supports arguments -Wdeprecated: YES
00:20:27.517 Compiler for C supports arguments -Wformat: YES
00:20:27.517 Compiler for C supports arguments -Wformat-nonliteral: NO
00:20:27.517 Compiler for C supports arguments -Wformat-security: NO
00:20:27.517 Compiler for C supports arguments -Wmissing-declarations: YES
00:20:27.517 Compiler for C supports arguments -Wmissing-prototypes: YES
00:20:27.517 Compiler for C supports arguments -Wnested-externs: YES
00:20:27.517 Compiler for C supports arguments -Wold-style-definition: YES
00:20:27.517 Compiler for C supports arguments -Wpointer-arith: YES
00:20:27.517 Compiler for C supports arguments -Wsign-compare: YES
00:20:27.517 Compiler for C supports arguments -Wstrict-prototypes: YES
00:20:27.517 Compiler for C supports arguments -Wundef: YES
00:20:27.517 Compiler for C supports arguments -Wwrite-strings: YES
00:20:27.517 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:20:27.517 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:20:27.517 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:20:27.517 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:20:27.517 Program objdump found: YES (/usr/bin/objdump)
00:20:27.517 Compiler for C supports arguments -mavx512f: YES
00:20:27.517 Checking if "AVX512 checking" compiles: YES
00:20:27.517 Fetching value of define "__SSE4_2__" : 1
00:20:27.517 Fetching value of define "__AES__" : 1
00:20:27.517 Fetching value of define "__AVX__" : 1
00:20:27.517 Fetching value of define "__AVX2__" : 1
00:20:27.517 Fetching value of define "__AVX512BW__" : 1
00:20:27.517 Fetching value of define "__AVX512CD__" : 1
00:20:27.517 Fetching value of define "__AVX512DQ__" : 1
00:20:27.517 Fetching value of define "__AVX512F__" : 1
00:20:27.517 Fetching value of define "__AVX512VL__" : 1
00:20:27.517 Fetching value of define "__PCLMUL__" : 1
00:20:27.517 Fetching value of define "__RDRND__" : 1
00:20:27.517 Fetching value of define "__RDSEED__" : 1
00:20:27.517 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:20:27.517 Fetching value of define "__znver1__" : (undefined)
00:20:27.517 Fetching value of define "__znver2__" : (undefined)
00:20:27.517 Fetching value of define "__znver3__" : (undefined)
00:20:27.517 Fetching value of define "__znver4__" : (undefined)
00:20:27.517 Compiler for C supports arguments -Wno-format-truncation: YES
00:20:27.517 Message: lib/log: Defining dependency "log"
00:20:27.517 Message: lib/kvargs: Defining dependency "kvargs"
00:20:27.517 Message: lib/telemetry: Defining dependency "telemetry"
00:20:27.517 Checking for function "getentropy" : NO
00:20:27.517 Message: lib/eal: Defining dependency "eal"
00:20:27.517 Message: lib/ring: Defining dependency "ring"
00:20:27.517 Message: lib/rcu: Defining dependency "rcu"
00:20:27.517 Message: lib/mempool: Defining dependency "mempool"
00:20:27.517 Message: lib/mbuf: Defining dependency "mbuf" 00:20:27.517 Fetching value of define "__PCLMUL__" : 1 (cached) 00:20:27.517 Fetching value of define "__AVX512F__" : 1 (cached) 00:20:27.517 Fetching value of define "__AVX512BW__" : 1 (cached) 00:20:27.517 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:20:27.517 Fetching value of define "__AVX512VL__" : 1 (cached) 00:20:27.517 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:20:27.517 Compiler for C supports arguments -mpclmul: YES 00:20:27.517 Compiler for C supports arguments -maes: YES 00:20:27.517 Compiler for C supports arguments -mavx512f: YES (cached) 00:20:27.517 Compiler for C supports arguments -mavx512bw: YES 00:20:27.517 Compiler for C supports arguments -mavx512dq: YES 00:20:27.517 Compiler for C supports arguments -mavx512vl: YES 00:20:27.517 Compiler for C supports arguments -mvpclmulqdq: YES 00:20:27.517 Compiler for C supports arguments -mavx2: YES 00:20:27.517 Compiler for C supports arguments -mavx: YES 00:20:27.517 Message: lib/net: Defining dependency "net" 00:20:27.517 Message: lib/meter: Defining dependency "meter" 00:20:27.517 Message: lib/ethdev: Defining dependency "ethdev" 00:20:27.517 Message: lib/pci: Defining dependency "pci" 00:20:27.517 Message: lib/cmdline: Defining dependency "cmdline" 00:20:27.517 Message: lib/hash: Defining dependency "hash" 00:20:27.517 Message: lib/timer: Defining dependency "timer" 00:20:27.517 Message: lib/compressdev: Defining dependency "compressdev" 00:20:27.517 Message: lib/cryptodev: Defining dependency "cryptodev" 00:20:27.517 Message: lib/dmadev: Defining dependency "dmadev" 00:20:27.517 Compiler for C supports arguments -Wno-cast-qual: YES 00:20:27.517 Message: lib/power: Defining dependency "power" 00:20:27.517 Message: lib/reorder: Defining dependency "reorder" 00:20:27.517 Message: lib/security: Defining dependency "security" 00:20:27.517 Has header "linux/userfaultfd.h" : YES 00:20:27.517 Has header 
"linux/vduse.h" : YES 00:20:27.517 Message: lib/vhost: Defining dependency "vhost" 00:20:27.517 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:20:27.517 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:20:27.517 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:20:27.517 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:20:27.517 Compiler for C supports arguments -std=c11: YES 00:20:27.517 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:20:27.517 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:20:27.517 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:20:27.517 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:20:27.517 Run-time dependency libmlx5 found: YES 1.24.44.0 00:20:27.517 Run-time dependency libibverbs found: YES 1.14.44.0 00:20:27.517 Library mtcr_ul found: NO 00:20:27.517 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol 
"mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:20:27.517 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header 
"infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:20:27.775 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:20:27.775 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:20:27.775 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 
00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies 
libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:20:27.776 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:20:27.776 Configuring mlx5_autoconf.h using configuration 00:20:27.776 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:20:27.776 Run-time dependency libcrypto found: YES 3.0.9 00:20:27.776 Library IPSec_MB found: YES 00:20:27.776 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:20:27.776 Message: drivers/common/qat: Defining dependency "common_qat" 00:20:27.776 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:20:27.776 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:20:27.776 Library IPSec_MB found: YES 00:20:27.776 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:20:27.776 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:20:27.776 Compiler for C supports arguments 
-std=c11: YES (cached) 00:20:27.776 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:20:27.776 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:20:27.776 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:20:27.776 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:20:27.776 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:20:27.776 Run-time dependency libisal found: NO (tried pkgconfig) 00:20:27.776 Library libisal found: NO 00:20:27.776 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:20:27.776 Compiler for C supports arguments -std=c11: YES (cached) 00:20:27.776 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:20:27.776 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:20:27.776 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:20:27.776 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:20:27.776 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:20:27.776 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:20:27.776 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:20:27.776 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:20:27.776 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:20:27.776 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:20:27.776 Program doxygen found: YES (/usr/bin/doxygen) 00:20:27.776 Configuring doxy-api-html.conf using configuration 00:20:27.776 Configuring doxy-api-man.conf using configuration 00:20:27.776 Program mandb found: YES (/usr/bin/mandb) 00:20:27.776 Program sphinx-build found: NO 00:20:27.776 Configuring rte_build_config.h using configuration 00:20:27.776 Message: 00:20:27.776 ================= 00:20:27.776 Applications Enabled 00:20:27.776 ================= 00:20:27.776 
00:20:27.776 apps: 00:20:27.776 00:20:27.776 00:20:27.776 Message: 00:20:27.776 ================= 00:20:27.776 Libraries Enabled 00:20:27.776 ================= 00:20:27.776 00:20:27.776 libs: 00:20:27.776 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:20:27.776 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:20:27.776 cryptodev, dmadev, power, reorder, security, vhost, 00:20:27.776 00:20:27.776 Message: 00:20:27.776 =============== 00:20:27.776 Drivers Enabled 00:20:27.776 =============== 00:20:27.776 00:20:27.776 common: 00:20:27.776 mlx5, qat, 00:20:27.776 bus: 00:20:27.776 auxiliary, pci, vdev, 00:20:27.776 mempool: 00:20:27.776 ring, 00:20:27.776 dma: 00:20:27.776 00:20:27.776 net: 00:20:27.776 00:20:27.776 crypto: 00:20:27.776 ipsec_mb, mlx5, 00:20:27.776 compress: 00:20:27.776 isal, mlx5, 00:20:27.776 vdpa: 00:20:27.776 00:20:27.776 00:20:27.776 Message: 00:20:27.776 ================= 00:20:27.776 Content Skipped 00:20:27.776 ================= 00:20:27.776 00:20:27.776 apps: 00:20:27.776 dumpcap: explicitly disabled via build config 00:20:27.776 graph: explicitly disabled via build config 00:20:27.776 pdump: explicitly disabled via build config 00:20:27.776 proc-info: explicitly disabled via build config 00:20:27.776 test-acl: explicitly disabled via build config 00:20:27.776 test-bbdev: explicitly disabled via build config 00:20:27.776 test-cmdline: explicitly disabled via build config 00:20:27.776 test-compress-perf: explicitly disabled via build config 00:20:27.776 test-crypto-perf: explicitly disabled via build config 00:20:27.776 test-dma-perf: explicitly disabled via build config 00:20:27.776 test-eventdev: explicitly disabled via build config 00:20:27.776 test-fib: explicitly disabled via build config 00:20:27.776 test-flow-perf: explicitly disabled via build config 00:20:27.776 test-gpudev: explicitly disabled via build config 00:20:27.776 test-mldev: explicitly disabled via build config 00:20:27.776 test-pipeline: explicitly 
disabled via build config 00:20:27.776 test-pmd: explicitly disabled via build config 00:20:27.776 test-regex: explicitly disabled via build config 00:20:27.776 test-sad: explicitly disabled via build config 00:20:27.776 test-security-perf: explicitly disabled via build config 00:20:27.776 00:20:27.776 libs: 00:20:27.776 argparse: explicitly disabled via build config 00:20:27.776 metrics: explicitly disabled via build config 00:20:27.776 acl: explicitly disabled via build config 00:20:27.776 bbdev: explicitly disabled via build config 00:20:27.776 bitratestats: explicitly disabled via build config 00:20:27.776 bpf: explicitly disabled via build config 00:20:27.776 cfgfile: explicitly disabled via build config 00:20:27.776 distributor: explicitly disabled via build config 00:20:27.776 efd: explicitly disabled via build config 00:20:27.776 eventdev: explicitly disabled via build config 00:20:27.776 dispatcher: explicitly disabled via build config 00:20:27.776 gpudev: explicitly disabled via build config 00:20:27.776 gro: explicitly disabled via build config 00:20:27.776 gso: explicitly disabled via build config 00:20:27.776 ip_frag: explicitly disabled via build config 00:20:27.776 jobstats: explicitly disabled via build config 00:20:27.776 latencystats: explicitly disabled via build config 00:20:27.776 lpm: explicitly disabled via build config 00:20:27.776 member: explicitly disabled via build config 00:20:27.776 pcapng: explicitly disabled via build config 00:20:27.776 rawdev: explicitly disabled via build config 00:20:27.776 regexdev: explicitly disabled via build config 00:20:27.776 mldev: explicitly disabled via build config 00:20:27.776 rib: explicitly disabled via build config 00:20:27.776 sched: explicitly disabled via build config 00:20:27.776 stack: explicitly disabled via build config 00:20:27.776 ipsec: explicitly disabled via build config 00:20:27.776 pdcp: explicitly disabled via build config 00:20:27.776 fib: explicitly disabled via build config 
00:20:27.776 port: explicitly disabled via build config 00:20:27.776 pdump: explicitly disabled via build config 00:20:27.776 table: explicitly disabled via build config 00:20:27.776 pipeline: explicitly disabled via build config 00:20:27.776 graph: explicitly disabled via build config 00:20:27.776 node: explicitly disabled via build config 00:20:27.777 00:20:27.777 drivers: 00:20:27.777 common/cpt: not in enabled drivers build config 00:20:27.777 common/dpaax: not in enabled drivers build config 00:20:27.777 common/iavf: not in enabled drivers build config 00:20:27.777 common/idpf: not in enabled drivers build config 00:20:27.777 common/ionic: not in enabled drivers build config 00:20:27.777 common/mvep: not in enabled drivers build config 00:20:27.777 common/octeontx: not in enabled drivers build config 00:20:27.777 bus/cdx: not in enabled drivers build config 00:20:27.777 bus/dpaa: not in enabled drivers build config 00:20:27.777 bus/fslmc: not in enabled drivers build config 00:20:27.777 bus/ifpga: not in enabled drivers build config 00:20:27.777 bus/platform: not in enabled drivers build config 00:20:27.777 bus/uacce: not in enabled drivers build config 00:20:27.777 bus/vmbus: not in enabled drivers build config 00:20:27.777 common/cnxk: not in enabled drivers build config 00:20:27.777 common/nfp: not in enabled drivers build config 00:20:27.777 common/nitrox: not in enabled drivers build config 00:20:27.777 common/sfc_efx: not in enabled drivers build config 00:20:27.777 mempool/bucket: not in enabled drivers build config 00:20:27.777 mempool/cnxk: not in enabled drivers build config 00:20:27.777 mempool/dpaa: not in enabled drivers build config 00:20:27.777 mempool/dpaa2: not in enabled drivers build config 00:20:27.777 mempool/octeontx: not in enabled drivers build config 00:20:27.777 mempool/stack: not in enabled drivers build config 00:20:27.777 dma/cnxk: not in enabled drivers build config 00:20:27.777 dma/dpaa: not in enabled drivers build config 
00:20:27.777 dma/dpaa2: not in enabled drivers build config 00:20:27.777 dma/hisilicon: not in enabled drivers build config 00:20:27.777 dma/idxd: not in enabled drivers build config 00:20:27.777 dma/ioat: not in enabled drivers build config 00:20:27.777 dma/skeleton: not in enabled drivers build config 00:20:27.777 net/af_packet: not in enabled drivers build config 00:20:27.777 net/af_xdp: not in enabled drivers build config 00:20:27.777 net/ark: not in enabled drivers build config 00:20:27.777 net/atlantic: not in enabled drivers build config 00:20:27.777 net/avp: not in enabled drivers build config 00:20:27.777 net/axgbe: not in enabled drivers build config 00:20:27.777 net/bnx2x: not in enabled drivers build config 00:20:27.777 net/bnxt: not in enabled drivers build config 00:20:27.777 net/bonding: not in enabled drivers build config 00:20:27.777 net/cnxk: not in enabled drivers build config 00:20:27.777 net/cpfl: not in enabled drivers build config 00:20:27.777 net/cxgbe: not in enabled drivers build config 00:20:27.777 net/dpaa: not in enabled drivers build config 00:20:27.777 net/dpaa2: not in enabled drivers build config 00:20:27.777 net/e1000: not in enabled drivers build config 00:20:27.777 net/ena: not in enabled drivers build config 00:20:27.777 net/enetc: not in enabled drivers build config 00:20:27.777 net/enetfec: not in enabled drivers build config 00:20:27.777 net/enic: not in enabled drivers build config 00:20:27.777 net/failsafe: not in enabled drivers build config 00:20:27.777 net/fm10k: not in enabled drivers build config 00:20:27.777 net/gve: not in enabled drivers build config 00:20:27.777 net/hinic: not in enabled drivers build config 00:20:27.777 net/hns3: not in enabled drivers build config 00:20:27.777 net/i40e: not in enabled drivers build config 00:20:27.777 net/iavf: not in enabled drivers build config 00:20:27.777 net/ice: not in enabled drivers build config 00:20:27.777 net/idpf: not in enabled drivers build config 00:20:27.777 
net/igc: not in enabled drivers build config 00:20:27.777 net/ionic: not in enabled drivers build config 00:20:27.777 net/ipn3ke: not in enabled drivers build config 00:20:27.777 net/ixgbe: not in enabled drivers build config 00:20:27.777 net/mana: not in enabled drivers build config 00:20:27.777 net/memif: not in enabled drivers build config 00:20:27.777 net/mlx4: not in enabled drivers build config 00:20:27.777 net/mlx5: not in enabled drivers build config 00:20:27.777 net/mvneta: not in enabled drivers build config 00:20:27.777 net/mvpp2: not in enabled drivers build config 00:20:27.777 net/netvsc: not in enabled drivers build config 00:20:27.777 net/nfb: not in enabled drivers build config 00:20:27.777 net/nfp: not in enabled drivers build config 00:20:27.777 net/ngbe: not in enabled drivers build config 00:20:27.777 net/null: not in enabled drivers build config 00:20:27.777 net/octeontx: not in enabled drivers build config 00:20:27.777 net/octeon_ep: not in enabled drivers build config 00:20:27.777 net/pcap: not in enabled drivers build config 00:20:27.777 net/pfe: not in enabled drivers build config 00:20:27.777 net/qede: not in enabled drivers build config 00:20:27.777 net/ring: not in enabled drivers build config 00:20:27.777 net/sfc: not in enabled drivers build config 00:20:27.777 net/softnic: not in enabled drivers build config 00:20:27.777 net/tap: not in enabled drivers build config 00:20:27.777 net/thunderx: not in enabled drivers build config 00:20:27.777 net/txgbe: not in enabled drivers build config 00:20:27.777 net/vdev_netvsc: not in enabled drivers build config 00:20:27.777 net/vhost: not in enabled drivers build config 00:20:27.777 net/virtio: not in enabled drivers build config 00:20:27.777 net/vmxnet3: not in enabled drivers build config 00:20:27.777 raw/*: missing internal dependency, "rawdev" 00:20:27.777 crypto/armv8: not in enabled drivers build config 00:20:27.777 crypto/bcmfs: not in enabled drivers build config 00:20:27.777 
crypto/caam_jr: not in enabled drivers build config 00:20:27.777 crypto/ccp: not in enabled drivers build config 00:20:27.777 crypto/cnxk: not in enabled drivers build config 00:20:27.777 crypto/dpaa_sec: not in enabled drivers build config 00:20:27.777 crypto/dpaa2_sec: not in enabled drivers build config 00:20:27.777 crypto/mvsam: not in enabled drivers build config 00:20:27.777 crypto/nitrox: not in enabled drivers build config 00:20:27.777 crypto/null: not in enabled drivers build config 00:20:27.777 crypto/octeontx: not in enabled drivers build config 00:20:27.777 crypto/openssl: not in enabled drivers build config 00:20:27.777 crypto/scheduler: not in enabled drivers build config 00:20:27.777 crypto/uadk: not in enabled drivers build config 00:20:27.777 crypto/virtio: not in enabled drivers build config 00:20:27.777 compress/nitrox: not in enabled drivers build config 00:20:27.777 compress/octeontx: not in enabled drivers build config 00:20:27.777 compress/zlib: not in enabled drivers build config 00:20:27.777 regex/*: missing internal dependency, "regexdev" 00:20:27.777 ml/*: missing internal dependency, "mldev" 00:20:27.777 vdpa/ifc: not in enabled drivers build config 00:20:27.777 vdpa/mlx5: not in enabled drivers build config 00:20:27.777 vdpa/nfp: not in enabled drivers build config 00:20:27.777 vdpa/sfc: not in enabled drivers build config 00:20:27.777 event/*: missing internal dependency, "eventdev" 00:20:27.777 baseband/*: missing internal dependency, "bbdev" 00:20:27.777 gpu/*: missing internal dependency, "gpudev" 00:20:27.777 00:20:27.777 00:20:28.341 Build targets in project: 115 00:20:28.341 00:20:28.341 DPDK 24.03.0 00:20:28.341 00:20:28.341 User defined options 00:20:28.341 buildtype : debug 00:20:28.341 default_library : shared 00:20:28.341 libdir : lib 00:20:28.341 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:20:28.341 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:20:28.341 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:20:28.341 cpu_instruction_set: native 00:20:28.341 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:20:28.341 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,argparse,pipeline,bbdev,table,metrics,member,jobstats,efd,rib 00:20:28.341 enable_docs : false 00:20:28.341 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:20:28.341 enable_kmods : false 00:20:28.341 tests : false 00:20:28.341 00:20:28.341 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:20:28.605 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:20:28.605 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:20:28.605 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:20:28.605 [3/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:20:28.605 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:20:28.605 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:20:28.605 [6/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 
00:20:28.605 [7/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:20:28.605 [8/378] Linking static target lib/librte_kvargs.a 00:20:28.605 [9/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:20:28.605 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:20:28.605 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:20:28.605 [12/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:20:28.605 [13/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:20:28.605 [14/378] Linking static target lib/librte_log.a 00:20:28.605 [15/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:20:28.863 [16/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:20:28.863 [17/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:20:28.863 [18/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:20:28.863 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:20:29.129 [20/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:20:29.129 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:20:29.129 [22/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:20:29.129 [23/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:20:29.129 [24/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:20:29.129 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:20:29.129 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:20:29.129 [27/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:20:29.129 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:20:29.129 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:20:29.129 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:20:29.129 [31/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:20:29.129 [32/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:20:29.129 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:20:29.129 [34/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:20:29.129 [35/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:20:29.129 [36/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:20:29.129 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:20:29.129 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:20:29.129 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:20:29.129 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:20:29.129 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:20:29.129 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:20:29.129 [43/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:20:29.129 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:20:29.129 [45/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:20:29.129 [46/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:20:29.129 [47/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:20:29.129 [48/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:20:29.129 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:20:29.129 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:20:29.129 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:20:29.129 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:20:29.129 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:20:29.129 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:20:29.129 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:20:29.129 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:20:29.129 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:20:29.129 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:20:29.129 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:20:29.129 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:20:29.129 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:20:29.129 [62/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:20:29.129 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:20:29.129 [64/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:20:29.129 [65/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:20:29.129 [66/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:20:29.388 [67/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:20:29.388 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:20:29.388 [69/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:20:29.388 [70/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:20:29.388 [71/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:20:29.388 [72/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:20:29.388 [73/378] Compiling C object 
lib/librte_pci.a.p/pci_rte_pci.c.o 00:20:29.388 [74/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:20:29.388 [75/378] Linking static target lib/librte_ring.a 00:20:29.388 [76/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:20:29.388 [77/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:20:29.388 [78/378] Linking static target lib/librte_telemetry.a 00:20:29.388 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:20:29.388 [80/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:20:29.388 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:20:29.388 [82/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:20:29.388 [83/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:20:29.388 [84/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:20:29.388 [85/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:20:29.388 [86/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:20:29.388 [87/378] Linking static target lib/librte_pci.a 00:20:29.388 [88/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:20:29.388 [89/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:20:29.388 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:20:29.388 [91/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:20:29.388 [92/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:20:29.388 [93/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:20:29.388 [94/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:20:29.388 [95/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:20:29.388 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 
00:20:29.388 [97/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:20:29.388 [98/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:20:29.388 [99/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:20:29.388 [100/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:20:29.388 [101/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:20:29.388 [102/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:20:29.388 [103/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:20:29.388 [104/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:20:29.388 [105/378] Linking static target lib/librte_rcu.a 00:20:29.388 [106/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:20:29.388 [107/378] Linking static target lib/librte_mempool.a 00:20:29.388 [108/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:20:29.388 [109/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:20:29.388 [110/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:20:29.388 [111/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:20:29.388 [112/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:20:29.650 [113/378] Linking static target lib/librte_eal.a 00:20:29.650 [114/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:20:29.650 [115/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:20:29.650 [116/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:20:29.650 [117/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:20:29.650 [118/378] Linking static target lib/librte_net.a 00:20:29.650 [119/378] Linking target lib/librte_log.so.24.1 00:20:29.651 [120/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 
00:20:29.651 [121/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:20:29.651 [122/378] Linking static target lib/librte_meter.a 00:20:29.651 [123/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:20:29.651 [124/378] Linking static target lib/librte_mbuf.a 00:20:29.651 [125/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:20:29.651 [126/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:20:29.651 [127/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:20:29.651 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:20:29.651 [129/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:20:29.910 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:20:29.910 [131/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:20:29.910 [132/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:20:29.910 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:20:29.910 [134/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:20:29.910 [135/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:20:29.910 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:20:29.910 [137/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:20:29.910 [138/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:20:29.910 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:20:29.910 [140/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:20:29.910 [141/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:20:29.910 [142/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 
00:20:29.910 [143/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:20:29.910 [144/378] Linking static target lib/librte_timer.a 00:20:29.910 [145/378] Linking target lib/librte_kvargs.so.24.1 00:20:29.910 [146/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:20:29.910 [147/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:20:29.910 [148/378] Linking static target lib/librte_cmdline.a 00:20:29.910 [149/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:20:29.910 [150/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:20:29.910 [151/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:20:29.910 [152/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:20:29.910 [153/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:20:29.910 [154/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:20:29.910 [155/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:20:29.910 [156/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:20:29.910 [157/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:20:29.910 [158/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:20:29.910 [159/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:20:29.910 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:20:29.910 [161/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:20:29.910 [162/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:20:29.910 [163/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:20:29.910 [164/378] Linking static target 
drivers/libtmp_rte_bus_auxiliary.a 00:20:29.910 [165/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:20:29.910 [166/378] Linking static target lib/librte_compressdev.a 00:20:29.910 [167/378] Linking static target lib/librte_dmadev.a 00:20:29.910 [168/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:20:29.910 [169/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:20:29.910 [170/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:20:29.910 [171/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:20:29.910 [172/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:20:29.910 [173/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:20:29.910 [174/378] Linking static target lib/librte_power.a 00:20:29.910 [175/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:20:29.910 [176/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:20:29.910 [177/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:20:30.177 [178/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:20:30.177 [179/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:20:30.177 [180/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:20:30.177 [181/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:20:30.177 [182/378] Linking static target lib/librte_reorder.a 00:20:30.177 [183/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.177 [184/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:20:30.177 [185/378] Linking target lib/librte_telemetry.so.24.1 00:20:30.177 [186/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:20:30.177 [187/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to 
capture output) 00:20:30.177 [188/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:20:30.177 [189/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:20:30.177 [190/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:20:30.177 [191/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:20:30.177 [192/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:20:30.177 [193/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:20:30.177 [194/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:20:30.177 [195/378] Linking static target lib/librte_security.a 00:20:30.177 [196/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:20:30.437 [197/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:20:30.437 [198/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:20:30.437 [199/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:20:30.437 [200/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:20:30.437 [201/378] Linking static target drivers/librte_bus_auxiliary.a 00:20:30.437 [202/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:20:30.437 [203/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:20:30.437 [204/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:20:30.437 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:20:30.437 [206/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:20:30.437 [207/378] Linking static target lib/librte_hash.a 00:20:30.437 [208/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:20:30.437 [209/378] Compiling C object 
drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:20:30.437 [210/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:20:30.437 [211/378] Linking static target drivers/librte_bus_vdev.a 00:20:30.437 [212/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:20:30.437 [213/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.437 [214/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:20:30.437 [215/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:20:30.437 [216/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:20:30.437 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:20:30.437 [218/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:20:30.437 [219/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:20:30.437 [220/378] Linking static target drivers/librte_bus_pci.a 00:20:30.437 [221/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:20:30.437 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:20:30.437 [223/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.437 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:20:30.437 [225/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:20:30.437 [226/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:20:30.437 [227/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:20:30.697 [228/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:20:30.697 [229/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:20:30.697 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:20:30.697 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:20:30.697 [232/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:20:30.697 [233/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [234/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:20:30.697 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:20:30.697 [236/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:20:30.697 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:20:30.697 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:20:30.697 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:20:30.697 [241/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:20:30.697 [242/378] Linking static target lib/librte_cryptodev.a 00:20:30.697 [243/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [244/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:20:30.697 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:20:30.697 [246/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 
00:20:30.697 [247/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:20:30.697 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:20:30.697 [249/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:20:30.697 [250/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:20:30.697 [251/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:20:30.697 [252/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:20:30.697 [254/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [255/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:20:30.697 [256/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:20:30.697 [257/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:20:30.697 [258/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:20:30.697 [259/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [260/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.697 [261/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:20:30.697 [262/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:20:30.956 [263/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:20:30.956 [264/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:20:30.956 
[265/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:20:30.956 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:20:30.956 [267/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:20:30.956 [268/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:20:30.956 [269/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:20:30.956 [270/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:20:30.956 [271/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:20:30.956 [272/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:20:30.956 [273/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:20:30.956 [274/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:20:30.956 [275/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:20:30.956 [276/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:20:30.956 [277/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:20:30.956 [278/378] Linking static target drivers/librte_mempool_ring.a 00:20:30.956 [279/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:20:30.956 [280/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:20:30.956 [281/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:20:30.956 [282/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:20:30.956 [283/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 
00:20:30.956 [284/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:20:30.956 [285/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:20:30.956 [286/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:20:30.956 [287/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:20:30.956 [288/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:20:30.956 [289/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:20:31.215 [290/378] Linking static target lib/librte_ethdev.a 00:20:31.215 [291/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:20:31.215 [292/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:20:31.215 [293/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:20:31.215 [294/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:20:31.215 [295/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:20:31.215 [296/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:20:31.215 [297/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:20:31.215 [298/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:20:31.215 [299/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:20:31.215 [300/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:20:31.215 [301/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:20:31.215 [302/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:20:31.215 [303/378] Linking static target drivers/librte_common_mlx5.a 00:20:31.215 [304/378] Compiling C 
object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:20:31.215 [305/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:20:31.215 [306/378] Linking static target drivers/librte_compress_mlx5.a 00:20:31.215 [307/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:20:31.215 [308/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:20:31.215 [309/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:20:31.215 [310/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:20:31.215 [311/378] Linking static target drivers/librte_compress_isal.a 00:20:31.215 [312/378] Linking static target drivers/librte_crypto_mlx5.a 00:20:31.474 [313/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:20:31.474 [314/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:20:31.474 [315/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:20:31.474 [316/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:20:31.474 [317/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:20:31.733 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:20:31.733 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:20:31.992 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:20:31.992 [321/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:20:31.992 [322/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:20:31.992 [323/378] Linking static target drivers/librte_common_qat.a 
00:20:32.250 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:20:32.250 [325/378] Linking static target lib/librte_vhost.a 00:20:32.818 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:20:34.803 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:20:36.713 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:20:40.909 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:20:41.843 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:20:41.843 [331/378] Linking target lib/librte_eal.so.24.1 00:20:41.843 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:20:41.843 [333/378] Linking target lib/librte_dmadev.so.24.1 00:20:41.843 [334/378] Linking target lib/librte_ring.so.24.1 00:20:41.843 [335/378] Linking target lib/librte_meter.so.24.1 00:20:41.843 [336/378] Linking target lib/librte_pci.so.24.1 00:20:41.843 [337/378] Linking target lib/librte_timer.so.24.1 00:20:41.843 [338/378] Linking target drivers/librte_bus_vdev.so.24.1 00:20:41.843 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:20:42.101 [340/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:20:42.101 [341/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:20:42.101 [342/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:20:42.101 [343/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:20:42.101 [344/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:20:42.101 [345/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:20:42.101 [346/378] Generating symbol 
file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:20:42.101 [347/378] Linking target drivers/librte_bus_pci.so.24.1 00:20:42.101 [348/378] Linking target lib/librte_rcu.so.24.1 00:20:42.101 [349/378] Linking target lib/librte_mempool.so.24.1 00:20:42.101 [350/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:20:42.101 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:20:42.101 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:20:42.359 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:20:42.359 [354/378] Linking target lib/librte_mbuf.so.24.1 00:20:42.359 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:20:42.359 [356/378] Linking target lib/librte_reorder.so.24.1 00:20:42.359 [357/378] Linking target lib/librte_cryptodev.so.24.1 00:20:42.359 [358/378] Linking target lib/librte_net.so.24.1 00:20:42.359 [359/378] Linking target lib/librte_compressdev.so.24.1 00:20:42.617 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:20:42.617 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:20:42.617 [362/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:20:42.617 [363/378] Linking target drivers/librte_compress_isal.so.24.1 00:20:42.617 [364/378] Linking target lib/librte_security.so.24.1 00:20:42.617 [365/378] Linking target lib/librte_hash.so.24.1 00:20:42.617 [366/378] Linking target lib/librte_cmdline.so.24.1 00:20:42.617 [367/378] Linking target lib/librte_ethdev.so.24.1 00:20:42.617 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:20:42.876 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:20:42.876 [370/378] Generating symbol file 
lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:20:42.876 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:20:42.876 [372/378] Linking target lib/librte_power.so.24.1 00:20:42.876 [373/378] Linking target lib/librte_vhost.so.24.1 00:20:42.876 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:20:42.876 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:20:42.876 [376/378] Linking target drivers/librte_common_qat.so.24.1 00:20:43.135 [377/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:20:43.135 [378/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:20:43.135 INFO: autodetecting backend as ninja 00:20:43.135 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:20:44.070 CC lib/log/log_deprecated.o 00:20:44.070 CC lib/log/log.o 00:20:44.070 CC lib/log/log_flags.o 00:20:44.070 CC lib/ut_mock/mock.o 00:20:44.070 CC lib/ut/ut.o 00:20:44.328 LIB libspdk_log.a 00:20:44.328 LIB libspdk_ut.a 00:20:44.328 LIB libspdk_ut_mock.a 00:20:44.328 SO libspdk_log.so.7.0 00:20:44.328 SO libspdk_ut.so.2.0 00:20:44.328 SO libspdk_ut_mock.so.6.0 00:20:44.328 SYMLINK libspdk_log.so 00:20:44.328 SYMLINK libspdk_ut.so 00:20:44.328 SYMLINK libspdk_ut_mock.so 00:20:44.895 CC lib/dma/dma.o 00:20:44.895 CC lib/util/bit_array.o 00:20:44.895 CC lib/util/base64.o 00:20:44.895 CC lib/util/cpuset.o 00:20:44.895 CC lib/util/crc16.o 00:20:44.895 CC lib/util/crc32.o 00:20:44.895 CC lib/util/crc64.o 00:20:44.895 CC lib/util/crc32c.o 00:20:44.895 CC lib/util/crc32_ieee.o 00:20:44.895 CC lib/util/dif.o 00:20:44.895 CC lib/util/fd.o 00:20:44.895 CC lib/util/file.o 00:20:44.895 CC lib/util/hexlify.o 00:20:44.895 CC lib/ioat/ioat.o 00:20:44.895 CC lib/util/iov.o 00:20:44.895 CC lib/util/math.o 00:20:44.895 CC lib/util/pipe.o 00:20:44.895 CC lib/util/strerror_tls.o 00:20:44.895 CC lib/util/string.o 
00:20:44.895 CC lib/util/uuid.o 00:20:44.895 CC lib/util/fd_group.o 00:20:44.895 CC lib/util/xor.o 00:20:44.895 CC lib/util/zipf.o 00:20:44.895 CXX lib/trace_parser/trace.o 00:20:44.895 CC lib/vfio_user/host/vfio_user_pci.o 00:20:44.895 CC lib/vfio_user/host/vfio_user.o 00:20:44.895 LIB libspdk_dma.a 00:20:44.895 SO libspdk_dma.so.4.0 00:20:44.895 SYMLINK libspdk_dma.so 00:20:45.154 LIB libspdk_ioat.a 00:20:45.154 SO libspdk_ioat.so.7.0 00:20:45.154 LIB libspdk_vfio_user.a 00:20:45.154 SYMLINK libspdk_ioat.so 00:20:45.154 SO libspdk_vfio_user.so.5.0 00:20:45.154 LIB libspdk_util.a 00:20:45.154 SYMLINK libspdk_vfio_user.so 00:20:45.412 SO libspdk_util.so.9.0 00:20:45.412 SYMLINK libspdk_util.so 00:20:45.412 LIB libspdk_trace_parser.a 00:20:45.412 SO libspdk_trace_parser.so.5.0 00:20:45.670 SYMLINK libspdk_trace_parser.so 00:20:45.670 CC lib/reduce/reduce.o 00:20:45.670 CC lib/conf/conf.o 00:20:45.670 CC lib/vmd/vmd.o 00:20:45.670 CC lib/vmd/led.o 00:20:45.670 CC lib/rdma/common.o 00:20:45.670 CC lib/rdma/rdma_verbs.o 00:20:45.670 CC lib/json/json_parse.o 00:20:45.670 CC lib/json/json_util.o 00:20:45.670 CC lib/json/json_write.o 00:20:45.670 CC lib/idxd/idxd.o 00:20:45.670 CC lib/env_dpdk/memory.o 00:20:45.670 CC lib/env_dpdk/env.o 00:20:45.670 CC lib/idxd/idxd_user.o 00:20:45.670 CC lib/idxd/idxd_kernel.o 00:20:45.670 CC lib/env_dpdk/pci.o 00:20:45.670 CC lib/env_dpdk/init.o 00:20:45.670 CC lib/env_dpdk/pci_ioat.o 00:20:45.670 CC lib/env_dpdk/threads.o 00:20:45.670 CC lib/env_dpdk/pci_virtio.o 00:20:45.670 CC lib/env_dpdk/pci_vmd.o 00:20:45.670 CC lib/env_dpdk/pci_idxd.o 00:20:45.670 CC lib/env_dpdk/pci_event.o 00:20:45.670 CC lib/env_dpdk/sigbus_handler.o 00:20:45.670 CC lib/env_dpdk/pci_dpdk.o 00:20:45.670 CC lib/env_dpdk/pci_dpdk_2207.o 00:20:45.670 CC lib/env_dpdk/pci_dpdk_2211.o 00:20:45.928 LIB libspdk_conf.a 00:20:45.928 SO libspdk_conf.so.6.0 00:20:45.928 LIB libspdk_rdma.a 00:20:46.186 LIB libspdk_json.a 00:20:46.186 SYMLINK libspdk_conf.so 00:20:46.186 SO 
libspdk_rdma.so.6.0 00:20:46.186 SO libspdk_json.so.6.0 00:20:46.186 SYMLINK libspdk_rdma.so 00:20:46.186 SYMLINK libspdk_json.so 00:20:46.186 LIB libspdk_idxd.a 00:20:46.186 LIB libspdk_reduce.a 00:20:46.186 SO libspdk_idxd.so.12.0 00:20:46.186 LIB libspdk_vmd.a 00:20:46.444 SO libspdk_reduce.so.6.0 00:20:46.444 SO libspdk_vmd.so.6.0 00:20:46.444 SYMLINK libspdk_idxd.so 00:20:46.444 SYMLINK libspdk_vmd.so 00:20:46.444 SYMLINK libspdk_reduce.so 00:20:46.444 CC lib/jsonrpc/jsonrpc_server.o 00:20:46.444 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:20:46.444 CC lib/jsonrpc/jsonrpc_client.o 00:20:46.444 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:20:46.701 LIB libspdk_jsonrpc.a 00:20:46.701 SO libspdk_jsonrpc.so.6.0 00:20:46.960 LIB libspdk_env_dpdk.a 00:20:46.960 SYMLINK libspdk_jsonrpc.so 00:20:46.960 SO libspdk_env_dpdk.so.14.1 00:20:46.960 SYMLINK libspdk_env_dpdk.so 00:20:47.218 CC lib/rpc/rpc.o 00:20:47.477 LIB libspdk_rpc.a 00:20:47.477 SO libspdk_rpc.so.6.0 00:20:47.477 SYMLINK libspdk_rpc.so 00:20:47.736 CC lib/trace/trace.o 00:20:47.736 CC lib/trace/trace_rpc.o 00:20:47.736 CC lib/trace/trace_flags.o 00:20:47.736 CC lib/keyring/keyring.o 00:20:47.736 CC lib/keyring/keyring_rpc.o 00:20:47.736 CC lib/notify/notify_rpc.o 00:20:47.736 CC lib/notify/notify.o 00:20:47.994 LIB libspdk_notify.a 00:20:47.994 SO libspdk_notify.so.6.0 00:20:47.994 LIB libspdk_trace.a 00:20:47.994 LIB libspdk_keyring.a 00:20:47.994 SO libspdk_trace.so.10.0 00:20:47.994 SO libspdk_keyring.so.1.0 00:20:47.994 SYMLINK libspdk_notify.so 00:20:48.251 SYMLINK libspdk_keyring.so 00:20:48.251 SYMLINK libspdk_trace.so 00:20:48.508 CC lib/thread/thread.o 00:20:48.508 CC lib/thread/iobuf.o 00:20:48.508 CC lib/sock/sock.o 00:20:48.508 CC lib/sock/sock_rpc.o 00:20:48.768 LIB libspdk_sock.a 00:20:49.033 SO libspdk_sock.so.9.0 00:20:49.033 SYMLINK libspdk_sock.so 00:20:49.291 CC lib/nvme/nvme_ctrlr_cmd.o 00:20:49.291 CC lib/nvme/nvme_ctrlr.o 00:20:49.291 CC lib/nvme/nvme_fabric.o 00:20:49.291 CC 
lib/nvme/nvme_ns.o 00:20:49.291 CC lib/nvme/nvme_pcie_common.o 00:20:49.291 CC lib/nvme/nvme_ns_cmd.o 00:20:49.291 CC lib/nvme/nvme_qpair.o 00:20:49.291 CC lib/nvme/nvme_pcie.o 00:20:49.291 CC lib/nvme/nvme_transport.o 00:20:49.291 CC lib/nvme/nvme.o 00:20:49.291 CC lib/nvme/nvme_quirks.o 00:20:49.291 CC lib/nvme/nvme_discovery.o 00:20:49.291 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:20:49.291 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:20:49.291 CC lib/nvme/nvme_tcp.o 00:20:49.291 CC lib/nvme/nvme_opal.o 00:20:49.291 CC lib/nvme/nvme_io_msg.o 00:20:49.291 CC lib/nvme/nvme_poll_group.o 00:20:49.291 CC lib/nvme/nvme_zns.o 00:20:49.291 CC lib/nvme/nvme_stubs.o 00:20:49.291 CC lib/nvme/nvme_auth.o 00:20:49.291 CC lib/nvme/nvme_cuse.o 00:20:49.291 CC lib/nvme/nvme_rdma.o 00:20:49.549 LIB libspdk_thread.a 00:20:49.549 SO libspdk_thread.so.10.0 00:20:49.807 SYMLINK libspdk_thread.so 00:20:50.066 CC lib/init/subsystem.o 00:20:50.066 CC lib/init/json_config.o 00:20:50.066 CC lib/init/subsystem_rpc.o 00:20:50.066 CC lib/init/rpc.o 00:20:50.066 CC lib/virtio/virtio.o 00:20:50.066 CC lib/virtio/virtio_vfio_user.o 00:20:50.066 CC lib/virtio/virtio_vhost_user.o 00:20:50.066 CC lib/virtio/virtio_pci.o 00:20:50.066 CC lib/accel/accel_sw.o 00:20:50.066 CC lib/accel/accel.o 00:20:50.066 CC lib/accel/accel_rpc.o 00:20:50.066 CC lib/blob/blobstore.o 00:20:50.066 CC lib/blob/request.o 00:20:50.066 CC lib/blob/blob_bs_dev.o 00:20:50.066 CC lib/blob/zeroes.o 00:20:50.325 LIB libspdk_init.a 00:20:50.325 SO libspdk_init.so.5.0 00:20:50.325 LIB libspdk_virtio.a 00:20:50.325 SO libspdk_virtio.so.7.0 00:20:50.325 SYMLINK libspdk_init.so 00:20:50.325 SYMLINK libspdk_virtio.so 00:20:50.583 CC lib/event/app.o 00:20:50.583 CC lib/event/reactor.o 00:20:50.583 CC lib/event/log_rpc.o 00:20:50.583 CC lib/event/app_rpc.o 00:20:50.583 CC lib/event/scheduler_static.o 00:20:50.842 LIB libspdk_accel.a 00:20:50.842 SO libspdk_accel.so.15.0 00:20:50.842 SYMLINK libspdk_accel.so 00:20:50.842 LIB libspdk_event.a 
00:20:51.101 LIB libspdk_nvme.a 00:20:51.101 SO libspdk_event.so.13.1 00:20:51.101 SO libspdk_nvme.so.13.1 00:20:51.101 SYMLINK libspdk_event.so 00:20:51.101 CC lib/bdev/bdev_rpc.o 00:20:51.101 CC lib/bdev/bdev_zone.o 00:20:51.101 CC lib/bdev/bdev.o 00:20:51.101 CC lib/bdev/part.o 00:20:51.101 CC lib/bdev/scsi_nvme.o 00:20:51.360 SYMLINK libspdk_nvme.so 00:20:52.297 LIB libspdk_blob.a 00:20:52.297 SO libspdk_blob.so.11.0 00:20:52.297 SYMLINK libspdk_blob.so 00:20:52.556 CC lib/blobfs/blobfs.o 00:20:52.556 CC lib/blobfs/tree.o 00:20:52.556 CC lib/lvol/lvol.o 00:20:53.125 LIB libspdk_bdev.a 00:20:53.125 SO libspdk_bdev.so.15.0 00:20:53.125 SYMLINK libspdk_bdev.so 00:20:53.125 LIB libspdk_blobfs.a 00:20:53.125 SO libspdk_blobfs.so.10.0 00:20:53.383 SYMLINK libspdk_blobfs.so 00:20:53.383 LIB libspdk_lvol.a 00:20:53.383 SO libspdk_lvol.so.10.0 00:20:53.383 CC lib/scsi/dev.o 00:20:53.383 CC lib/scsi/lun.o 00:20:53.383 CC lib/scsi/port.o 00:20:53.383 CC lib/scsi/scsi.o 00:20:53.383 CC lib/scsi/scsi_rpc.o 00:20:53.383 CC lib/scsi/scsi_bdev.o 00:20:53.383 CC lib/scsi/scsi_pr.o 00:20:53.383 CC lib/ublk/ublk.o 00:20:53.383 CC lib/ublk/ublk_rpc.o 00:20:53.383 CC lib/scsi/task.o 00:20:53.383 CC lib/ftl/ftl_init.o 00:20:53.383 CC lib/ftl/ftl_core.o 00:20:53.383 CC lib/ftl/ftl_layout.o 00:20:53.383 SYMLINK libspdk_lvol.so 00:20:53.383 CC lib/ftl/ftl_sb.o 00:20:53.383 CC lib/ftl/ftl_debug.o 00:20:53.383 CC lib/ftl/ftl_io.o 00:20:53.383 CC lib/ftl/ftl_l2p_flat.o 00:20:53.383 CC lib/ftl/ftl_l2p.o 00:20:53.383 CC lib/ftl/ftl_nv_cache.o 00:20:53.383 CC lib/ftl/ftl_band.o 00:20:53.383 CC lib/nvmf/ctrlr.o 00:20:53.383 CC lib/ftl/ftl_writer.o 00:20:53.383 CC lib/ftl/ftl_band_ops.o 00:20:53.383 CC lib/nvmf/ctrlr_discovery.o 00:20:53.383 CC lib/ftl/ftl_rq.o 00:20:53.383 CC lib/ftl/ftl_reloc.o 00:20:53.383 CC lib/ftl/ftl_p2l.o 00:20:53.383 CC lib/nvmf/ctrlr_bdev.o 00:20:53.383 CC lib/ftl/ftl_l2p_cache.o 00:20:53.383 CC lib/nvmf/nvmf.o 00:20:53.383 CC lib/nvmf/subsystem.o 00:20:53.383 CC 
lib/ftl/mngt/ftl_mngt.o 00:20:53.383 CC lib/nvmf/nvmf_rpc.o 00:20:53.383 CC lib/nvmf/transport.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:20:53.383 CC lib/nvmf/stubs.o 00:20:53.383 CC lib/nvmf/tcp.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_startup.o 00:20:53.383 CC lib/nbd/nbd.o 00:20:53.383 CC lib/nvmf/rdma.o 00:20:53.383 CC lib/nbd/nbd_rpc.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_md.o 00:20:53.383 CC lib/nvmf/auth.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_misc.o 00:20:53.383 CC lib/nvmf/mdns_server.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_band.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:20:53.383 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:20:53.383 CC lib/ftl/utils/ftl_conf.o 00:20:53.383 CC lib/ftl/utils/ftl_md.o 00:20:53.645 CC lib/ftl/utils/ftl_mempool.o 00:20:53.645 CC lib/ftl/utils/ftl_bitmap.o 00:20:53.645 CC lib/ftl/utils/ftl_property.o 00:20:53.645 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:20:53.645 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:20:53.645 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:20:53.645 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:20:53.645 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:20:53.645 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:20:53.645 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:20:53.645 CC lib/ftl/upgrade/ftl_sb_v3.o 00:20:53.645 CC lib/ftl/nvc/ftl_nvc_dev.o 00:20:53.645 CC lib/ftl/upgrade/ftl_sb_v5.o 00:20:53.645 CC lib/ftl/base/ftl_base_dev.o 00:20:53.645 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:20:53.645 CC lib/ftl/base/ftl_base_bdev.o 00:20:53.645 CC lib/ftl/ftl_trace.o 00:20:54.213 LIB libspdk_scsi.a 00:20:54.213 LIB libspdk_nbd.a 00:20:54.213 SO libspdk_scsi.so.9.0 00:20:54.213 SO libspdk_nbd.so.7.0 00:20:54.213 LIB libspdk_ublk.a 00:20:54.213 SYMLINK libspdk_nbd.so 00:20:54.213 SYMLINK libspdk_scsi.so 
00:20:54.213 SO libspdk_ublk.so.3.0 00:20:54.213 SYMLINK libspdk_ublk.so 00:20:54.473 LIB libspdk_ftl.a 00:20:54.473 CC lib/iscsi/conn.o 00:20:54.473 CC lib/iscsi/init_grp.o 00:20:54.473 CC lib/iscsi/iscsi.o 00:20:54.473 CC lib/iscsi/md5.o 00:20:54.473 CC lib/iscsi/param.o 00:20:54.473 CC lib/iscsi/portal_grp.o 00:20:54.473 CC lib/iscsi/iscsi_subsystem.o 00:20:54.473 CC lib/iscsi/tgt_node.o 00:20:54.473 CC lib/iscsi/task.o 00:20:54.473 CC lib/iscsi/iscsi_rpc.o 00:20:54.473 CC lib/vhost/vhost_scsi.o 00:20:54.473 CC lib/vhost/vhost.o 00:20:54.473 CC lib/vhost/vhost_rpc.o 00:20:54.473 CC lib/vhost/vhost_blk.o 00:20:54.473 CC lib/vhost/rte_vhost_user.o 00:20:54.732 SO libspdk_ftl.so.9.0 00:20:54.992 SYMLINK libspdk_ftl.so 00:20:55.293 LIB libspdk_nvmf.a 00:20:55.293 SO libspdk_nvmf.so.18.1 00:20:55.293 LIB libspdk_vhost.a 00:20:55.293 SO libspdk_vhost.so.8.0 00:20:55.552 SYMLINK libspdk_vhost.so 00:20:55.552 SYMLINK libspdk_nvmf.so 00:20:55.552 LIB libspdk_iscsi.a 00:20:55.552 SO libspdk_iscsi.so.8.0 00:20:55.812 SYMLINK libspdk_iscsi.so 00:20:56.379 CC module/env_dpdk/env_dpdk_rpc.o 00:20:56.379 CC module/keyring/linux/keyring.o 00:20:56.379 CC module/keyring/linux/keyring_rpc.o 00:20:56.379 LIB libspdk_env_dpdk_rpc.a 00:20:56.379 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:20:56.379 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:20:56.379 CC module/accel/ioat/accel_ioat.o 00:20:56.379 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:20:56.379 CC module/accel/ioat/accel_ioat_rpc.o 00:20:56.379 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:20:56.379 CC module/keyring/file/keyring_rpc.o 00:20:56.379 CC module/keyring/file/keyring.o 00:20:56.379 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:20:56.379 CC module/blob/bdev/blob_bdev.o 00:20:56.379 CC module/scheduler/dynamic/scheduler_dynamic.o 00:20:56.379 CC module/accel/dsa/accel_dsa.o 00:20:56.379 CC module/accel/dsa/accel_dsa_rpc.o 00:20:56.379 CC 
module/accel/error/accel_error_rpc.o 00:20:56.379 CC module/scheduler/gscheduler/gscheduler.o 00:20:56.379 CC module/accel/error/accel_error.o 00:20:56.379 CC module/sock/posix/posix.o 00:20:56.379 SO libspdk_env_dpdk_rpc.so.6.0 00:20:56.379 CC module/accel/iaa/accel_iaa.o 00:20:56.379 CC module/accel/iaa/accel_iaa_rpc.o 00:20:56.379 SYMLINK libspdk_env_dpdk_rpc.so 00:20:56.637 LIB libspdk_keyring_linux.a 00:20:56.637 LIB libspdk_scheduler_dpdk_governor.a 00:20:56.637 LIB libspdk_keyring_file.a 00:20:56.637 SO libspdk_keyring_linux.so.1.0 00:20:56.637 LIB libspdk_accel_ioat.a 00:20:56.637 LIB libspdk_scheduler_gscheduler.a 00:20:56.637 LIB libspdk_scheduler_dynamic.a 00:20:56.637 SO libspdk_scheduler_dpdk_governor.so.4.0 00:20:56.637 SO libspdk_accel_ioat.so.6.0 00:20:56.637 SO libspdk_keyring_file.so.1.0 00:20:56.637 LIB libspdk_accel_error.a 00:20:56.637 SO libspdk_scheduler_gscheduler.so.4.0 00:20:56.637 SYMLINK libspdk_keyring_linux.so 00:20:56.637 SO libspdk_scheduler_dynamic.so.4.0 00:20:56.637 LIB libspdk_blob_bdev.a 00:20:56.637 LIB libspdk_accel_iaa.a 00:20:56.637 LIB libspdk_accel_dsa.a 00:20:56.637 SYMLINK libspdk_scheduler_dpdk_governor.so 00:20:56.637 SO libspdk_accel_error.so.2.0 00:20:56.637 SO libspdk_blob_bdev.so.11.0 00:20:56.637 SYMLINK libspdk_accel_ioat.so 00:20:56.637 SYMLINK libspdk_scheduler_gscheduler.so 00:20:56.637 SO libspdk_accel_iaa.so.3.0 00:20:56.637 SYMLINK libspdk_scheduler_dynamic.so 00:20:56.637 SYMLINK libspdk_keyring_file.so 00:20:56.637 SO libspdk_accel_dsa.so.5.0 00:20:56.637 SYMLINK libspdk_blob_bdev.so 00:20:56.637 SYMLINK libspdk_accel_error.so 00:20:56.637 SYMLINK libspdk_accel_iaa.so 00:20:56.637 SYMLINK libspdk_accel_dsa.so 00:20:56.894 LIB libspdk_sock_posix.a 00:20:57.151 SO libspdk_sock_posix.so.6.0 00:20:57.151 SYMLINK libspdk_sock_posix.so 00:20:57.151 CC module/bdev/delay/vbdev_delay_rpc.o 00:20:57.151 CC module/bdev/delay/vbdev_delay.o 00:20:57.152 CC module/bdev/gpt/vbdev_gpt.o 00:20:57.152 CC 
module/bdev/gpt/gpt.o 00:20:57.152 CC module/bdev/error/vbdev_error.o 00:20:57.152 CC module/bdev/error/vbdev_error_rpc.o 00:20:57.152 CC module/bdev/nvme/bdev_nvme_rpc.o 00:20:57.152 CC module/bdev/crypto/vbdev_crypto.o 00:20:57.152 CC module/bdev/passthru/vbdev_passthru.o 00:20:57.152 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:20:57.152 CC module/bdev/nvme/nvme_rpc.o 00:20:57.152 CC module/bdev/nvme/bdev_nvme.o 00:20:57.152 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:20:57.152 CC module/blobfs/bdev/blobfs_bdev.o 00:20:57.152 CC module/bdev/nvme/vbdev_opal.o 00:20:57.152 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:20:57.152 CC module/bdev/nvme/vbdev_opal_rpc.o 00:20:57.152 CC module/bdev/nvme/bdev_mdns_client.o 00:20:57.152 CC module/bdev/null/bdev_null_rpc.o 00:20:57.152 CC module/bdev/null/bdev_null.o 00:20:57.152 CC module/bdev/iscsi/bdev_iscsi.o 00:20:57.152 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:20:57.152 CC module/bdev/virtio/bdev_virtio_scsi.o 00:20:57.152 CC module/bdev/split/vbdev_split_rpc.o 00:20:57.152 CC module/bdev/virtio/bdev_virtio_blk.o 00:20:57.152 CC module/bdev/split/vbdev_split.o 00:20:57.152 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:20:57.152 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:20:57.152 CC module/bdev/zone_block/vbdev_zone_block.o 00:20:57.152 CC module/bdev/malloc/bdev_malloc_rpc.o 00:20:57.152 CC module/bdev/malloc/bdev_malloc.o 00:20:57.152 CC module/bdev/ftl/bdev_ftl.o 00:20:57.152 CC module/bdev/virtio/bdev_virtio_rpc.o 00:20:57.152 CC module/bdev/aio/bdev_aio_rpc.o 00:20:57.152 CC module/bdev/aio/bdev_aio.o 00:20:57.152 CC module/bdev/ftl/bdev_ftl_rpc.o 00:20:57.152 CC module/bdev/raid/bdev_raid.o 00:20:57.152 CC module/bdev/raid/bdev_raid_rpc.o 00:20:57.152 CC module/bdev/lvol/vbdev_lvol.o 00:20:57.152 CC module/bdev/raid/bdev_raid_sb.o 00:20:57.152 CC module/bdev/raid/raid0.o 00:20:57.152 CC module/bdev/raid/raid1.o 00:20:57.152 CC module/bdev/raid/concat.o 00:20:57.152 CC module/bdev/lvol/vbdev_lvol_rpc.o 
00:20:57.152 CC module/bdev/compress/vbdev_compress.o 00:20:57.152 CC module/bdev/compress/vbdev_compress_rpc.o 00:20:57.152 LIB libspdk_accel_dpdk_compressdev.a 00:20:57.409 SO libspdk_accel_dpdk_compressdev.so.3.0 00:20:57.409 SYMLINK libspdk_accel_dpdk_compressdev.so 00:20:57.409 LIB libspdk_blobfs_bdev.a 00:20:57.409 LIB libspdk_bdev_split.a 00:20:57.409 LIB libspdk_bdev_null.a 00:20:57.409 SO libspdk_blobfs_bdev.so.6.0 00:20:57.409 SO libspdk_bdev_split.so.6.0 00:20:57.409 LIB libspdk_accel_dpdk_cryptodev.a 00:20:57.409 SO libspdk_bdev_null.so.6.0 00:20:57.409 LIB libspdk_bdev_passthru.a 00:20:57.409 LIB libspdk_bdev_ftl.a 00:20:57.409 LIB libspdk_bdev_aio.a 00:20:57.668 LIB libspdk_bdev_zone_block.a 00:20:57.668 LIB libspdk_bdev_error.a 00:20:57.668 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:20:57.668 SO libspdk_bdev_passthru.so.6.0 00:20:57.668 LIB libspdk_bdev_iscsi.a 00:20:57.668 SO libspdk_bdev_ftl.so.6.0 00:20:57.668 LIB libspdk_bdev_gpt.a 00:20:57.668 SO libspdk_bdev_aio.so.6.0 00:20:57.668 SYMLINK libspdk_bdev_split.so 00:20:57.668 SYMLINK libspdk_blobfs_bdev.so 00:20:57.668 LIB libspdk_bdev_crypto.a 00:20:57.668 SYMLINK libspdk_bdev_null.so 00:20:57.668 SO libspdk_bdev_iscsi.so.6.0 00:20:57.668 SO libspdk_bdev_error.so.6.0 00:20:57.668 SO libspdk_bdev_gpt.so.6.0 00:20:57.668 SO libspdk_bdev_zone_block.so.6.0 00:20:57.668 SO libspdk_bdev_crypto.so.6.0 00:20:57.668 LIB libspdk_bdev_delay.a 00:20:57.668 LIB libspdk_bdev_malloc.a 00:20:57.668 SYMLINK libspdk_bdev_aio.so 00:20:57.668 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:20:57.668 SYMLINK libspdk_bdev_passthru.so 00:20:57.668 SYMLINK libspdk_bdev_ftl.so 00:20:57.668 SYMLINK libspdk_bdev_gpt.so 00:20:57.668 SYMLINK libspdk_bdev_iscsi.so 00:20:57.668 SO libspdk_bdev_malloc.so.6.0 00:20:57.668 SYMLINK libspdk_bdev_error.so 00:20:57.668 SO libspdk_bdev_delay.so.6.0 00:20:57.668 SYMLINK libspdk_bdev_crypto.so 00:20:57.668 SYMLINK libspdk_bdev_zone_block.so 00:20:57.668 LIB libspdk_bdev_virtio.a 00:20:57.668 
LIB libspdk_bdev_compress.a 00:20:57.668 SYMLINK libspdk_bdev_malloc.so 00:20:57.668 SO libspdk_bdev_virtio.so.6.0 00:20:57.668 SO libspdk_bdev_compress.so.6.0 00:20:57.668 SYMLINK libspdk_bdev_delay.so 00:20:57.668 LIB libspdk_bdev_lvol.a 00:20:57.668 SYMLINK libspdk_bdev_compress.so 00:20:57.668 SYMLINK libspdk_bdev_virtio.so 00:20:57.925 SO libspdk_bdev_lvol.so.6.0 00:20:57.925 SYMLINK libspdk_bdev_lvol.so 00:20:57.925 LIB libspdk_bdev_raid.a 00:20:57.925 SO libspdk_bdev_raid.so.6.0 00:20:58.183 SYMLINK libspdk_bdev_raid.so 00:20:58.750 LIB libspdk_bdev_nvme.a 00:20:59.008 SO libspdk_bdev_nvme.so.7.0 00:20:59.008 SYMLINK libspdk_bdev_nvme.so 00:20:59.573 CC module/event/subsystems/sock/sock.o 00:20:59.573 CC module/event/subsystems/iobuf/iobuf.o 00:20:59.573 CC module/event/subsystems/vmd/vmd.o 00:20:59.573 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:20:59.573 CC module/event/subsystems/vmd/vmd_rpc.o 00:20:59.573 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:20:59.573 CC module/event/subsystems/keyring/keyring.o 00:20:59.573 CC module/event/subsystems/scheduler/scheduler.o 00:20:59.833 LIB libspdk_event_keyring.a 00:20:59.833 LIB libspdk_event_sock.a 00:20:59.833 LIB libspdk_event_scheduler.a 00:20:59.833 LIB libspdk_event_iobuf.a 00:20:59.833 LIB libspdk_event_vhost_blk.a 00:20:59.833 SO libspdk_event_keyring.so.1.0 00:20:59.833 LIB libspdk_event_vmd.a 00:20:59.833 SO libspdk_event_scheduler.so.4.0 00:20:59.833 SO libspdk_event_sock.so.5.0 00:20:59.833 SO libspdk_event_iobuf.so.3.0 00:20:59.833 SO libspdk_event_vmd.so.6.0 00:20:59.833 SO libspdk_event_vhost_blk.so.3.0 00:20:59.833 SYMLINK libspdk_event_keyring.so 00:20:59.833 SYMLINK libspdk_event_scheduler.so 00:20:59.833 SYMLINK libspdk_event_sock.so 00:20:59.833 SYMLINK libspdk_event_iobuf.so 00:20:59.833 SYMLINK libspdk_event_vmd.so 00:20:59.833 SYMLINK libspdk_event_vhost_blk.so 00:21:00.448 CC module/event/subsystems/accel/accel.o 00:21:00.448 LIB libspdk_event_accel.a 00:21:00.448 SO 
libspdk_event_accel.so.6.0 00:21:00.448 SYMLINK libspdk_event_accel.so 00:21:01.014 CC module/event/subsystems/bdev/bdev.o 00:21:01.014 LIB libspdk_event_bdev.a 00:21:01.014 SO libspdk_event_bdev.so.6.0 00:21:01.014 SYMLINK libspdk_event_bdev.so 00:21:01.579 CC module/event/subsystems/scsi/scsi.o 00:21:01.579 CC module/event/subsystems/nbd/nbd.o 00:21:01.579 CC module/event/subsystems/ublk/ublk.o 00:21:01.579 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:21:01.579 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:21:01.579 LIB libspdk_event_nbd.a 00:21:01.579 LIB libspdk_event_scsi.a 00:21:01.579 LIB libspdk_event_ublk.a 00:21:01.579 SO libspdk_event_nbd.so.6.0 00:21:01.579 SO libspdk_event_scsi.so.6.0 00:21:01.579 SO libspdk_event_ublk.so.3.0 00:21:01.579 SYMLINK libspdk_event_nbd.so 00:21:01.579 LIB libspdk_event_nvmf.a 00:21:01.579 SYMLINK libspdk_event_scsi.so 00:21:01.579 SYMLINK libspdk_event_ublk.so 00:21:01.836 SO libspdk_event_nvmf.so.6.0 00:21:01.836 SYMLINK libspdk_event_nvmf.so 00:21:02.094 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:21:02.094 CC module/event/subsystems/iscsi/iscsi.o 00:21:02.094 LIB libspdk_event_vhost_scsi.a 00:21:02.094 SO libspdk_event_vhost_scsi.so.3.0 00:21:02.094 LIB libspdk_event_iscsi.a 00:21:02.353 SO libspdk_event_iscsi.so.6.0 00:21:02.353 SYMLINK libspdk_event_vhost_scsi.so 00:21:02.353 SYMLINK libspdk_event_iscsi.so 00:21:02.611 SO libspdk.so.6.0 00:21:02.611 SYMLINK libspdk.so 00:21:02.876 CC app/trace_record/trace_record.o 00:21:02.876 CXX app/trace/trace.o 00:21:02.876 CC app/spdk_nvme_perf/perf.o 00:21:02.876 CC app/spdk_lspci/spdk_lspci.o 00:21:02.876 CC app/spdk_nvme_identify/identify.o 00:21:02.876 CC test/rpc_client/rpc_client_test.o 00:21:02.876 CC app/spdk_nvme_discover/discovery_aer.o 00:21:02.876 CC app/spdk_top/spdk_top.o 00:21:02.876 TEST_HEADER include/spdk/accel.h 00:21:02.876 TEST_HEADER include/spdk/accel_module.h 00:21:02.876 TEST_HEADER include/spdk/assert.h 00:21:02.876 TEST_HEADER 
include/spdk/barrier.h 00:21:02.876 TEST_HEADER include/spdk/base64.h 00:21:02.876 TEST_HEADER include/spdk/bdev.h 00:21:02.876 TEST_HEADER include/spdk/bdev_module.h 00:21:02.876 TEST_HEADER include/spdk/bdev_zone.h 00:21:02.876 TEST_HEADER include/spdk/bit_array.h 00:21:02.876 CC examples/interrupt_tgt/interrupt_tgt.o 00:21:02.876 TEST_HEADER include/spdk/bit_pool.h 00:21:02.876 TEST_HEADER include/spdk/blob_bdev.h 00:21:02.876 CC app/spdk_dd/spdk_dd.o 00:21:02.876 TEST_HEADER include/spdk/blobfs_bdev.h 00:21:02.876 TEST_HEADER include/spdk/blobfs.h 00:21:02.876 TEST_HEADER include/spdk/blob.h 00:21:02.876 TEST_HEADER include/spdk/conf.h 00:21:02.876 CC app/iscsi_tgt/iscsi_tgt.o 00:21:02.876 TEST_HEADER include/spdk/config.h 00:21:02.876 TEST_HEADER include/spdk/cpuset.h 00:21:02.876 CC app/nvmf_tgt/nvmf_main.o 00:21:02.876 TEST_HEADER include/spdk/crc16.h 00:21:02.876 TEST_HEADER include/spdk/crc32.h 00:21:02.876 CC app/vhost/vhost.o 00:21:02.876 TEST_HEADER include/spdk/crc64.h 00:21:02.876 TEST_HEADER include/spdk/dif.h 00:21:02.876 TEST_HEADER include/spdk/dma.h 00:21:03.135 TEST_HEADER include/spdk/endian.h 00:21:03.135 CC app/spdk_tgt/spdk_tgt.o 00:21:03.135 TEST_HEADER include/spdk/env_dpdk.h 00:21:03.135 TEST_HEADER include/spdk/env.h 00:21:03.135 CC examples/nvme/nvme_manage/nvme_manage.o 00:21:03.135 TEST_HEADER include/spdk/event.h 00:21:03.135 CC examples/vmd/lsvmd/lsvmd.o 00:21:03.135 CC examples/vmd/led/led.o 00:21:03.135 TEST_HEADER include/spdk/fd_group.h 00:21:03.135 TEST_HEADER include/spdk/fd.h 00:21:03.135 CC examples/nvme/hello_world/hello_world.o 00:21:03.135 CC examples/nvme/reconnect/reconnect.o 00:21:03.135 CC examples/nvme/abort/abort.o 00:21:03.135 CC examples/ioat/verify/verify.o 00:21:03.135 TEST_HEADER include/spdk/file.h 00:21:03.135 CC examples/nvme/cmb_copy/cmb_copy.o 00:21:03.135 CC app/fio/nvme/fio_plugin.o 00:21:03.135 CC examples/sock/hello_world/hello_sock.o 00:21:03.135 TEST_HEADER include/spdk/ftl.h 00:21:03.135 CC 
examples/util/zipf/zipf.o 00:21:03.135 CC examples/accel/perf/accel_perf.o 00:21:03.135 TEST_HEADER include/spdk/gpt_spec.h 00:21:03.135 CC test/nvme/overhead/overhead.o 00:21:03.135 CC examples/nvme/hotplug/hotplug.o 00:21:03.135 CC test/env/vtophys/vtophys.o 00:21:03.135 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:21:03.135 TEST_HEADER include/spdk/hexlify.h 00:21:03.135 CC test/app/jsoncat/jsoncat.o 00:21:03.135 TEST_HEADER include/spdk/histogram_data.h 00:21:03.135 CC examples/nvme/arbitration/arbitration.o 00:21:03.135 CC examples/ioat/perf/perf.o 00:21:03.135 CC test/thread/poller_perf/poller_perf.o 00:21:03.135 CC test/env/memory/memory_ut.o 00:21:03.135 CC test/nvme/reset/reset.o 00:21:03.135 TEST_HEADER include/spdk/idxd.h 00:21:03.135 CC test/nvme/fused_ordering/fused_ordering.o 00:21:03.135 CC test/event/event_perf/event_perf.o 00:21:03.135 CC test/app/stub/stub.o 00:21:03.135 CC test/nvme/reserve/reserve.o 00:21:03.135 CC test/nvme/e2edp/nvme_dp.o 00:21:03.135 CC test/nvme/sgl/sgl.o 00:21:03.135 CC test/nvme/simple_copy/simple_copy.o 00:21:03.135 TEST_HEADER include/spdk/idxd_spec.h 00:21:03.135 CC test/app/histogram_perf/histogram_perf.o 00:21:03.135 CC examples/idxd/perf/perf.o 00:21:03.135 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:21:03.135 TEST_HEADER include/spdk/init.h 00:21:03.135 CC test/nvme/compliance/nvme_compliance.o 00:21:03.135 CC test/nvme/startup/startup.o 00:21:03.135 CC test/nvme/connect_stress/connect_stress.o 00:21:03.135 CC test/nvme/boot_partition/boot_partition.o 00:21:03.135 TEST_HEADER include/spdk/ioat.h 00:21:03.135 CC test/event/reactor_perf/reactor_perf.o 00:21:03.135 CC test/nvme/aer/aer.o 00:21:03.135 TEST_HEADER include/spdk/ioat_spec.h 00:21:03.135 CC examples/nvmf/nvmf/nvmf.o 00:21:03.135 CC test/nvme/err_injection/err_injection.o 00:21:03.135 CC test/env/pci/pci_ut.o 00:21:03.135 TEST_HEADER include/spdk/iscsi_spec.h 00:21:03.135 CC examples/blob/cli/blobcli.o 00:21:03.135 TEST_HEADER 
include/spdk/json.h 00:21:03.135 CC test/event/reactor/reactor.o 00:21:03.135 TEST_HEADER include/spdk/jsonrpc.h 00:21:03.135 CC examples/bdev/bdevperf/bdevperf.o 00:21:03.135 TEST_HEADER include/spdk/keyring.h 00:21:03.135 TEST_HEADER include/spdk/keyring_module.h 00:21:03.135 CC examples/blob/hello_world/hello_blob.o 00:21:03.135 TEST_HEADER include/spdk/likely.h 00:21:03.135 TEST_HEADER include/spdk/log.h 00:21:03.135 CC examples/thread/thread/thread_ex.o 00:21:03.135 CC test/dma/test_dma/test_dma.o 00:21:03.135 TEST_HEADER include/spdk/lvol.h 00:21:03.135 CC test/event/app_repeat/app_repeat.o 00:21:03.135 TEST_HEADER include/spdk/memory.h 00:21:03.135 TEST_HEADER include/spdk/mmio.h 00:21:03.135 CC examples/bdev/hello_world/hello_bdev.o 00:21:03.135 TEST_HEADER include/spdk/nbd.h 00:21:03.135 CC app/fio/bdev/fio_plugin.o 00:21:03.135 TEST_HEADER include/spdk/notify.h 00:21:03.135 CC test/accel/dif/dif.o 00:21:03.135 TEST_HEADER include/spdk/nvme.h 00:21:03.135 CC test/app/bdev_svc/bdev_svc.o 00:21:03.135 TEST_HEADER include/spdk/nvme_intel.h 00:21:03.135 CC test/bdev/bdevio/bdevio.o 00:21:03.135 TEST_HEADER include/spdk/nvme_ocssd.h 00:21:03.135 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:21:03.135 TEST_HEADER include/spdk/nvme_spec.h 00:21:03.135 TEST_HEADER include/spdk/nvme_zns.h 00:21:03.135 TEST_HEADER include/spdk/nvmf_cmd.h 00:21:03.135 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:21:03.135 LINK spdk_lspci 00:21:03.135 TEST_HEADER include/spdk/nvmf.h 00:21:03.135 TEST_HEADER include/spdk/nvmf_spec.h 00:21:03.135 TEST_HEADER include/spdk/nvmf_transport.h 00:21:03.135 TEST_HEADER include/spdk/opal.h 00:21:03.135 TEST_HEADER include/spdk/opal_spec.h 00:21:03.135 CC test/blobfs/mkfs/mkfs.o 00:21:03.135 TEST_HEADER include/spdk/pci_ids.h 00:21:03.135 TEST_HEADER include/spdk/pipe.h 00:21:03.135 TEST_HEADER include/spdk/queue.h 00:21:03.135 TEST_HEADER include/spdk/reduce.h 00:21:03.135 TEST_HEADER include/spdk/rpc.h 00:21:03.135 TEST_HEADER 
include/spdk/scheduler.h 00:21:03.397 TEST_HEADER include/spdk/scsi.h 00:21:03.397 CC test/env/mem_callbacks/mem_callbacks.o 00:21:03.397 LINK rpc_client_test 00:21:03.397 TEST_HEADER include/spdk/scsi_spec.h 00:21:03.397 TEST_HEADER include/spdk/sock.h 00:21:03.397 TEST_HEADER include/spdk/stdinc.h 00:21:03.397 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:21:03.397 TEST_HEADER include/spdk/string.h 00:21:03.397 TEST_HEADER include/spdk/thread.h 00:21:03.397 TEST_HEADER include/spdk/trace.h 00:21:03.397 TEST_HEADER include/spdk/trace_parser.h 00:21:03.397 TEST_HEADER include/spdk/tree.h 00:21:03.397 TEST_HEADER include/spdk/ublk.h 00:21:03.397 LINK interrupt_tgt 00:21:03.397 CC test/lvol/esnap/esnap.o 00:21:03.397 TEST_HEADER include/spdk/util.h 00:21:03.397 LINK spdk_nvme_discover 00:21:03.397 TEST_HEADER include/spdk/uuid.h 00:21:03.397 TEST_HEADER include/spdk/version.h 00:21:03.397 TEST_HEADER include/spdk/vfio_user_pci.h 00:21:03.397 TEST_HEADER include/spdk/vfio_user_spec.h 00:21:03.397 TEST_HEADER include/spdk/vhost.h 00:21:03.397 LINK vhost 00:21:03.397 LINK nvmf_tgt 00:21:03.397 TEST_HEADER include/spdk/vmd.h 00:21:03.397 TEST_HEADER include/spdk/xor.h 00:21:03.397 TEST_HEADER include/spdk/zipf.h 00:21:03.397 LINK jsoncat 00:21:03.397 LINK iscsi_tgt 00:21:03.397 LINK vtophys 00:21:03.397 CXX test/cpp_headers/accel.o 00:21:03.397 LINK lsvmd 00:21:03.397 LINK led 00:21:03.397 LINK zipf 00:21:03.397 LINK reactor_perf 00:21:03.397 LINK event_perf 00:21:03.397 LINK pmr_persistence 00:21:03.397 LINK spdk_trace_record 00:21:03.397 LINK env_dpdk_post_init 00:21:03.397 LINK cmb_copy 00:21:03.397 LINK histogram_perf 00:21:03.397 LINK poller_perf 00:21:03.397 LINK spdk_tgt 00:21:03.397 LINK reactor 00:21:03.397 LINK reserve 00:21:03.397 LINK verify 00:21:03.397 LINK fused_ordering 00:21:03.397 LINK startup 00:21:03.397 LINK boot_partition 00:21:03.397 LINK app_repeat 00:21:03.397 LINK hello_sock 00:21:03.397 LINK stub 00:21:03.397 LINK simple_copy 00:21:03.397 LINK 
connect_stress 00:21:03.397 LINK hello_world 00:21:03.659 LINK err_injection 00:21:03.659 LINK ioat_perf 00:21:03.659 LINK reset 00:21:03.659 LINK bdev_svc 00:21:03.659 LINK hello_blob 00:21:03.659 LINK hotplug 00:21:03.659 LINK thread 00:21:03.659 LINK sgl 00:21:03.659 LINK spdk_trace 00:21:03.659 LINK overhead 00:21:03.659 LINK hello_bdev 00:21:03.659 LINK nvme_dp 00:21:03.659 LINK reconnect 00:21:03.659 LINK spdk_dd 00:21:03.659 CXX test/cpp_headers/accel_module.o 00:21:03.659 LINK mkfs 00:21:03.659 LINK aer 00:21:03.659 LINK abort 00:21:03.659 LINK nvmf 00:21:03.659 LINK arbitration 00:21:03.659 LINK nvme_compliance 00:21:03.659 LINK idxd_perf 00:21:03.659 CC test/nvme/doorbell_aers/doorbell_aers.o 00:21:03.659 LINK pci_ut 00:21:03.659 CXX test/cpp_headers/assert.o 00:21:03.659 CXX test/cpp_headers/barrier.o 00:21:03.923 CXX test/cpp_headers/base64.o 00:21:03.923 CXX test/cpp_headers/bdev.o 00:21:03.923 CXX test/cpp_headers/bdev_module.o 00:21:03.923 CXX test/cpp_headers/bdev_zone.o 00:21:03.923 CXX test/cpp_headers/bit_array.o 00:21:03.923 CXX test/cpp_headers/bit_pool.o 00:21:03.923 LINK bdevio 00:21:03.923 CXX test/cpp_headers/blob_bdev.o 00:21:03.923 LINK test_dma 00:21:03.923 CXX test/cpp_headers/blobfs_bdev.o 00:21:03.923 CXX test/cpp_headers/blobfs.o 00:21:03.923 CXX test/cpp_headers/blob.o 00:21:03.923 CXX test/cpp_headers/conf.o 00:21:03.923 LINK accel_perf 00:21:03.923 CXX test/cpp_headers/config.o 00:21:03.923 CXX test/cpp_headers/cpuset.o 00:21:03.923 CXX test/cpp_headers/crc16.o 00:21:03.923 CXX test/cpp_headers/crc32.o 00:21:03.923 CXX test/cpp_headers/crc64.o 00:21:03.923 CXX test/cpp_headers/dif.o 00:21:03.923 CXX test/cpp_headers/dma.o 00:21:03.923 CXX test/cpp_headers/endian.o 00:21:03.923 LINK dif 00:21:03.923 CXX test/cpp_headers/env_dpdk.o 00:21:03.923 CXX test/cpp_headers/env.o 00:21:03.923 CXX test/cpp_headers/event.o 00:21:03.923 CXX test/cpp_headers/fd_group.o 00:21:03.923 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:21:03.923 CXX 
test/cpp_headers/fd.o 00:21:03.923 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:21:03.923 LINK nvme_manage 00:21:03.923 CC test/event/scheduler/scheduler.o 00:21:03.923 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:21:03.923 CXX test/cpp_headers/file.o 00:21:03.923 CXX test/cpp_headers/ftl.o 00:21:03.923 CXX test/cpp_headers/gpt_spec.o 00:21:03.923 CXX test/cpp_headers/histogram_data.o 00:21:03.923 CXX test/cpp_headers/hexlify.o 00:21:03.923 CXX test/cpp_headers/idxd.o 00:21:03.923 CC test/nvme/cuse/cuse.o 00:21:03.923 CXX test/cpp_headers/idxd_spec.o 00:21:03.923 CXX test/cpp_headers/init.o 00:21:03.923 LINK spdk_nvme 00:21:03.923 CC test/nvme/fdp/fdp.o 00:21:03.923 LINK nvme_fuzz 00:21:03.923 CXX test/cpp_headers/ioat.o 00:21:03.923 LINK spdk_bdev 00:21:03.923 CXX test/cpp_headers/ioat_spec.o 00:21:03.923 CXX test/cpp_headers/iscsi_spec.o 00:21:03.923 LINK blobcli 00:21:03.923 CXX test/cpp_headers/json.o 00:21:03.923 CXX test/cpp_headers/jsonrpc.o 00:21:03.923 CXX test/cpp_headers/keyring.o 00:21:03.923 CXX test/cpp_headers/keyring_module.o 00:21:04.182 CXX test/cpp_headers/likely.o 00:21:04.182 CXX test/cpp_headers/log.o 00:21:04.182 CXX test/cpp_headers/lvol.o 00:21:04.182 CXX test/cpp_headers/mmio.o 00:21:04.182 CXX test/cpp_headers/memory.o 00:21:04.182 CXX test/cpp_headers/nbd.o 00:21:04.182 CXX test/cpp_headers/notify.o 00:21:04.182 LINK doorbell_aers 00:21:04.182 CXX test/cpp_headers/nvme.o 00:21:04.182 CXX test/cpp_headers/nvme_intel.o 00:21:04.182 CXX test/cpp_headers/nvme_ocssd.o 00:21:04.182 CXX test/cpp_headers/nvme_ocssd_spec.o 00:21:04.182 CXX test/cpp_headers/nvme_spec.o 00:21:04.182 LINK mem_callbacks 00:21:04.182 CXX test/cpp_headers/nvme_zns.o 00:21:04.182 LINK spdk_nvme_perf 00:21:04.182 CXX test/cpp_headers/nvmf_cmd.o 00:21:04.182 CXX test/cpp_headers/nvmf_fc_spec.o 00:21:04.182 CXX test/cpp_headers/nvmf.o 00:21:04.182 CXX test/cpp_headers/nvmf_spec.o 00:21:04.182 CXX test/cpp_headers/nvmf_transport.o 00:21:04.182 CXX test/cpp_headers/opal.o 
00:21:04.182 CXX test/cpp_headers/pci_ids.o 00:21:04.182 CXX test/cpp_headers/pipe.o 00:21:04.182 CXX test/cpp_headers/opal_spec.o 00:21:04.182 CXX test/cpp_headers/queue.o 00:21:04.182 CXX test/cpp_headers/reduce.o 00:21:04.182 CXX test/cpp_headers/rpc.o 00:21:04.182 CXX test/cpp_headers/scheduler.o 00:21:04.441 CXX test/cpp_headers/scsi.o 00:21:04.441 CXX test/cpp_headers/scsi_spec.o 00:21:04.441 CXX test/cpp_headers/sock.o 00:21:04.441 CXX test/cpp_headers/stdinc.o 00:21:04.441 CXX test/cpp_headers/string.o 00:21:04.441 CXX test/cpp_headers/thread.o 00:21:04.441 CXX test/cpp_headers/trace.o 00:21:04.441 CXX test/cpp_headers/tree.o 00:21:04.441 CXX test/cpp_headers/trace_parser.o 00:21:04.441 LINK scheduler 00:21:04.441 CXX test/cpp_headers/ublk.o 00:21:04.441 CXX test/cpp_headers/util.o 00:21:04.441 CXX test/cpp_headers/uuid.o 00:21:04.441 CXX test/cpp_headers/version.o 00:21:04.441 CXX test/cpp_headers/vfio_user_pci.o 00:21:04.441 CXX test/cpp_headers/vfio_user_spec.o 00:21:04.441 CXX test/cpp_headers/vhost.o 00:21:04.441 CXX test/cpp_headers/vmd.o 00:21:04.441 CXX test/cpp_headers/xor.o 00:21:04.441 LINK spdk_nvme_identify 00:21:04.441 CXX test/cpp_headers/zipf.o 00:21:04.441 LINK spdk_top 00:21:04.441 LINK bdevperf 00:21:04.700 LINK vhost_fuzz 00:21:04.700 LINK memory_ut 00:21:04.700 LINK fdp 00:21:05.268 LINK cuse 00:21:05.527 LINK iscsi_fuzz 00:21:08.061 LINK esnap 00:21:08.061 00:21:08.061 real 1m13.865s 00:21:08.061 user 14m43.561s 00:21:08.061 sys 4m6.676s 00:21:08.061 11:31:51 make -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:21:08.061 11:31:51 make -- common/autotest_common.sh@10 -- $ set +x 00:21:08.061 ************************************ 00:21:08.062 END TEST make 00:21:08.062 ************************************ 00:21:08.062 11:31:51 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:21:08.062 11:31:51 -- pm/common@29 -- $ signal_monitor_resources TERM 00:21:08.062 11:31:51 -- pm/common@40 -- $ local monitor pid pids signal=TERM 
00:21:08.062 11:31:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:21:08.062 11:31:51 -- pm/common@44 -- $ pid=4177431 00:21:08.062 11:31:51 -- pm/common@50 -- $ kill -TERM 4177431 00:21:08.062 11:31:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:21:08.062 11:31:51 -- pm/common@44 -- $ pid=4177433 00:21:08.062 11:31:51 -- pm/common@50 -- $ kill -TERM 4177433 00:21:08.062 11:31:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:21:08.062 11:31:51 -- pm/common@44 -- $ pid=4177435 00:21:08.062 11:31:51 -- pm/common@50 -- $ kill -TERM 4177435 00:21:08.062 11:31:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:21:08.062 11:31:51 -- pm/common@44 -- $ pid=4177460 00:21:08.062 11:31:51 -- pm/common@50 -- $ sudo -E kill -TERM 4177460 00:21:08.062 11:31:51 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:21:08.062 11:31:51 -- nvmf/common.sh@7 -- # uname -s 00:21:08.062 11:31:51 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:21:08.062 11:31:51 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:21:08.062 11:31:51 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:21:08.062 11:31:51 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:21:08.062 11:31:51 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:21:08.062 11:31:51 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:21:08.062 11:31:51 -- nvmf/common.sh@14 -- # 
NVMF_TCP_IP_ADDRESS=127.0.0.1 00:21:08.062 11:31:51 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:21:08.062 11:31:51 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:21:08.062 11:31:51 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:21:08.062 11:31:51 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:804da62e-425e-e711-906e-0017a4403562 00:21:08.062 11:31:51 -- nvmf/common.sh@18 -- # NVME_HOSTID=804da62e-425e-e711-906e-0017a4403562 00:21:08.062 11:31:51 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:21:08.062 11:31:51 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:21:08.062 11:31:51 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:21:08.062 11:31:51 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:21:08.062 11:31:51 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:21:08.062 11:31:51 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:21:08.062 11:31:51 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:21:08.062 11:31:51 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:21:08.062 11:31:51 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:21:08.062 11:31:51 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:21:08.062 11:31:51 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:21:08.062 11:31:51 -- paths/export.sh@5 -- # export PATH 00:21:08.062 11:31:51 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:21:08.062 11:31:51 -- nvmf/common.sh@47 -- # : 0 00:21:08.062 11:31:51 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:21:08.062 11:31:51 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:21:08.062 11:31:51 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:21:08.062 11:31:51 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:21:08.062 11:31:51 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:21:08.062 11:31:51 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:21:08.062 11:31:51 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:21:08.062 11:31:51 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:21:08.062 11:31:51 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:21:08.062 11:31:51 -- spdk/autotest.sh@32 -- # uname -s 00:21:08.062 11:31:51 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:21:08.062 11:31:51 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:21:08.062 11:31:51 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:21:08.062 11:31:51 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:21:08.062 11:31:51 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 
00:21:08.062 11:31:51 -- spdk/autotest.sh@44 -- # modprobe nbd 00:21:08.062 11:31:51 -- spdk/autotest.sh@46 -- # type -P udevadm 00:21:08.062 11:31:51 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:21:08.062 11:31:51 -- spdk/autotest.sh@48 -- # udevadm_pid=47692 00:21:08.062 11:31:51 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:21:08.062 11:31:51 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:21:08.062 11:31:51 -- pm/common@17 -- # local monitor 00:21:08.062 11:31:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@21 -- # date +%s 00:21:08.062 11:31:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:21:08.062 11:31:51 -- pm/common@21 -- # date +%s 00:21:08.062 11:31:52 -- pm/common@25 -- # sleep 1 00:21:08.062 11:31:52 -- pm/common@21 -- # date +%s 00:21:08.062 11:31:52 -- pm/common@21 -- # date +%s 00:21:08.062 11:31:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718011912 00:21:08.320 11:31:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718011912 00:21:08.320 11:31:52 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718011912 00:21:08.320 11:31:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p 
monitor.autotest.sh.1718011912 00:21:08.320 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718011912_collect-cpu-load.pm.log 00:21:08.320 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718011912_collect-vmstat.pm.log 00:21:08.320 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718011912_collect-cpu-temp.pm.log 00:21:08.320 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718011912_collect-bmc-pm.bmc.pm.log 00:21:09.255 11:31:53 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:21:09.255 11:31:53 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:21:09.255 11:31:53 -- common/autotest_common.sh@723 -- # xtrace_disable 00:21:09.255 11:31:53 -- common/autotest_common.sh@10 -- # set +x 00:21:09.255 11:31:53 -- spdk/autotest.sh@59 -- # create_test_list 00:21:09.255 11:31:53 -- common/autotest_common.sh@747 -- # xtrace_disable 00:21:09.255 11:31:53 -- common/autotest_common.sh@10 -- # set +x 00:21:09.255 11:31:53 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:21:09.255 11:31:53 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:21:09.255 11:31:53 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:21:09.255 11:31:53 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:21:09.255 11:31:53 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:21:09.255 11:31:53 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:21:09.255 11:31:53 -- common/autotest_common.sh@1454 -- # uname 00:21:09.255 11:31:53 -- common/autotest_common.sh@1454 -- # '[' Linux = FreeBSD ']' 00:21:09.255 11:31:53 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 
00:21:09.255 11:31:53 -- common/autotest_common.sh@1474 -- # uname 00:21:09.255 11:31:53 -- common/autotest_common.sh@1474 -- # [[ Linux = FreeBSD ]] 00:21:09.255 11:31:53 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:21:09.255 11:31:53 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:21:09.255 11:31:53 -- spdk/autotest.sh@72 -- # hash lcov 00:21:09.255 11:31:53 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:21:09.255 11:31:53 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:21:09.255 --rc lcov_branch_coverage=1 00:21:09.255 --rc lcov_function_coverage=1 00:21:09.255 --rc genhtml_branch_coverage=1 00:21:09.255 --rc genhtml_function_coverage=1 00:21:09.255 --rc genhtml_legend=1 00:21:09.255 --rc geninfo_all_blocks=1 00:21:09.255 ' 00:21:09.255 11:31:53 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:21:09.255 --rc lcov_branch_coverage=1 00:21:09.255 --rc lcov_function_coverage=1 00:21:09.255 --rc genhtml_branch_coverage=1 00:21:09.255 --rc genhtml_function_coverage=1 00:21:09.255 --rc genhtml_legend=1 00:21:09.255 --rc geninfo_all_blocks=1 00:21:09.255 ' 00:21:09.255 11:31:53 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:21:09.255 --rc lcov_branch_coverage=1 00:21:09.255 --rc lcov_function_coverage=1 00:21:09.255 --rc genhtml_branch_coverage=1 00:21:09.255 --rc genhtml_function_coverage=1 00:21:09.255 --rc genhtml_legend=1 00:21:09.255 --rc geninfo_all_blocks=1 00:21:09.255 --no-external' 00:21:09.255 11:31:53 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:21:09.255 --rc lcov_branch_coverage=1 00:21:09.255 --rc lcov_function_coverage=1 00:21:09.255 --rc genhtml_branch_coverage=1 00:21:09.255 --rc genhtml_function_coverage=1 00:21:09.255 --rc genhtml_legend=1 00:21:09.255 --rc geninfo_all_blocks=1 00:21:09.255 --no-external' 00:21:09.255 11:31:53 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 
--no-external -v 00:21:09.255 lcov: LCOV version 1.14 00:21:09.255 11:31:53 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:21:19.241 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:21:19.241 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:21:34.145 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:21:34.145 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:21:34.145 geninfo: WARNING: GCOV did not produce any 
data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno
00:21:34.145 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found
00:21:34.145 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno
00:21:34.146 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found
00:21:34.146 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno
00:21:34.146 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found
00:21:34.146 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno
00:21:34.146 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found
00:21:34.146 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno
00:21:34.146 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found
00:21:34.146 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno
00:21:34.146 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found
00:21:34.146 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno
00:21:34.146 11:32:17 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup
00:21:34.146 11:32:17 -- common/autotest_common.sh@723 -- # xtrace_disable
00:21:34.146 11:32:17 -- common/autotest_common.sh@10 -- # set +x
00:21:34.146 11:32:17 -- spdk/autotest.sh@91 -- # rm -f
00:21:34.146 11:32:17 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:21:37.433 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:21:37.433 0000:5e:00.0 (8086 0b60): Already using the nvme driver
00:21:37.433 0000:00:04.7 (8086 2021): Already using the ioatdma driver
00:21:37.433 0000:00:04.6 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:00:04.5 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:00:04.4 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:00:04.3 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:00:04.2 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:00:04.1 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:00:04.0 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:80:04.7 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:80:04.6 (8086 2021): Already using the ioatdma driver
00:21:37.692 0000:80:04.5 (8086 2021): Already using the ioatdma driver
00:21:37.951 0000:80:04.4 (8086 2021): Already using the ioatdma driver
00:21:37.951 0000:80:04.3 (8086 2021): Already using the ioatdma driver
00:21:37.951 0000:80:04.2 (8086 2021): Already using the ioatdma driver
00:21:37.951 0000:80:04.1 (8086 2021): Already using the ioatdma driver
00:21:37.951 0000:80:04.0 (8086 2021): Already using the ioatdma driver
00:21:37.951 11:32:21 -- spdk/autotest.sh@96 -- # get_zoned_devs
00:21:37.951 11:32:21 -- common/autotest_common.sh@1668 -- # zoned_devs=()
00:21:37.951 11:32:21 -- common/autotest_common.sh@1668 -- # local -gA zoned_devs
00:21:37.951 11:32:21 -- common/autotest_common.sh@1669 -- # local nvme bdf
00:21:37.951 11:32:21 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme*
00:21:37.951 11:32:21 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1
00:21:37.951 11:32:21 -- common/autotest_common.sh@1661 -- # local device=nvme0n1
00:21:37.951 11:32:21 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:21:37.951 11:32:21 -- common/autotest_common.sh@1664 -- # [[ none != none ]]
00:21:37.951 11:32:21 -- spdk/autotest.sh@98 -- # (( 0 > 0 ))
00:21:37.951 11:32:21 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*)
00:21:37.951 11:32:21 -- spdk/autotest.sh@112 -- # [[ -z '' ]]
00:21:37.951 11:32:21 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1
00:21:37.951 11:32:21 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt
00:21:37.951 11:32:21 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1
00:21:37.951 No valid GPT data, bailing
00:21:37.951 11:32:21 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:21:37.951 11:32:21 -- scripts/common.sh@391 -- # pt=
00:21:37.951 11:32:21 -- scripts/common.sh@392 -- # return 1
00:21:37.951 11:32:21 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
00:21:37.951 1+0 records in
00:21:37.951 1+0 records out
00:21:37.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00180127 s, 582 MB/s
00:21:37.951 11:32:21 -- spdk/autotest.sh@118 -- # sync
00:21:37.951 11:32:21 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes
00:21:37.951 11:32:21 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null'
00:21:37.951 11:32:21 -- common/autotest_common.sh@22 -- # reap_spdk_processes
00:21:43.226 11:32:26 -- spdk/autotest.sh@124 -- # uname -s
00:21:43.226 11:32:26 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']'
00:21:43.226 11:32:26 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:21:43.226 11:32:26 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:21:43.226 11:32:26 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:21:43.226 11:32:26 -- common/autotest_common.sh@10 -- # set +x
00:21:43.226 ************************************
00:21:43.226 START TEST setup.sh
00:21:43.226 ************************************
00:21:43.226 11:32:26 setup.sh -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:21:43.226 * Looking for test storage...
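The pre-cleanup trace above probes each NVMe namespace before wiping it: `is_block_zoned` reads the sysfs `zoned` attribute, `block_in_use` falls back from `spdk-gpt.py` to `blkid` when no GPT is present, and only then is the first MiB zeroed with `dd`. A minimal sketch of the zoned-device probe follows; this is a reconstruction, not the SPDK helper itself, and the `SYS_BLOCK` override is introduced here only so the sketch can run without real hardware:

```shell
# Reconstruction of the zoned-device check traced above: a block device
# counts as zoned when /sys/block/<dev>/queue/zoned holds anything other
# than "none". SYS_BLOCK is an illustrative override, not part of SPDK.
SYS_BLOCK=${SYS_BLOCK:-/sys/block}

is_block_zoned() {
    device=$1
    # A missing attribute means the kernel treats the device as non-zoned
    if [ -e "$SYS_BLOCK/$device/queue/zoned" ]; then
        cat "$SYS_BLOCK/$device/queue/zoned"
    else
        echo none
    fi
}
```

On a conventional (non-zoned) drive the attribute reads `none`, which is why the `[[ none != none ]]` branch in the trace never fires and `zoned_devs` stays empty.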
00:21:43.226 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:21:43.226 11:32:26 setup.sh -- setup/test-setup.sh@10 -- # uname -s
00:21:43.226 11:32:27 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]]
00:21:43.226 11:32:27 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:21:43.226 11:32:27 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:21:43.226 11:32:27 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable
00:21:43.226 11:32:27 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:21:43.226 ************************************
00:21:43.226 START TEST acl
00:21:43.226 ************************************
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:21:43.226 * Looking for test storage...
00:21:43.226 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:21:43.226 11:32:27 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1668 -- # zoned_devs=()
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1668 -- # local -gA zoned_devs
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1669 -- # local nvme bdf
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme*
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme0n1
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:21:43.226 11:32:27 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ none != none ]]
00:21:43.226 11:32:27 setup.sh.acl -- setup/acl.sh@12 -- # devs=()
00:21:43.226 11:32:27 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs
00:21:43.226 11:32:27 setup.sh.acl -- setup/acl.sh@13 -- # drivers=()
00:21:43.226 11:32:27 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers
00:21:43.226 11:32:27 setup.sh.acl -- setup/acl.sh@51 -- # setup reset
00:21:43.226 11:32:27 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]]
00:21:43.226 11:32:27 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:21:47.417 11:32:30 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs
00:21:47.417 11:32:30 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver
00:21:47.417 11:32:30 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:47.417 11:32:30 setup.sh.acl -- setup/acl.sh@15 -- # setup output status
00:21:47.417 11:32:30 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]]
00:21:47.417 11:32:30 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780 Hugepages
00:21:50.780 node hugesize free / total
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780
00:21:50.780 Type BDF Vendor Device NUMA Driver Device Block devices
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]]
00:21:50.780 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev")
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]]
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 ))
00:21:50.781 11:32:34 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied
00:21:50.781 11:32:34 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:21:50.781 11:32:34 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable
00:21:50.781 11:32:34 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:21:50.781 ************************************
00:21:50.781 START TEST denied
00:21:50.781 ************************************
00:21:50.781 11:32:34 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # denied
00:21:50.781 11:32:34 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0'
00:21:50.781 11:32:34 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config
00:21:50.781 11:32:34 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0'
00:21:50.781 11:32:34 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]]
00:21:50.781 11:32:34 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:21:54.984 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@"
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]]
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]]
00:21:54.984 11:32:38 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:21:59.179
00:21:59.179 real 0m8.542s
00:21:59.179 user 0m2.763s
00:21:59.179 sys 0m5.104s
00:21:59.179 11:32:42 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # xtrace_disable
00:21:59.179 11:32:42 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x
00:21:59.179 ************************************
00:21:59.179 END TEST denied
00:21:59.179 ************************************
00:21:59.179 11:32:42 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed
00:21:59.179 11:32:42 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:21:59.179 11:32:42 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable
00:21:59.179 11:32:42 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:21:59.179 ************************************
00:21:59.179 START TEST allowed
00:21:59.179 ************************************
00:21:59.179 11:32:42 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # allowed
00:21:59.179 11:32:42 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0
00:21:59.179 11:32:42 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config
00:21:59.179 11:32:42 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*'
00:21:59.179 11:32:42 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]]
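The `denied` and `allowed` tests above both decide whether `setup.sh` honoured `PCI_BLOCKED`/`PCI_ALLOWED` by resolving the device's `driver` symlink in sysfs, as traced at `setup/acl.sh@32`. A minimal sketch of that resolution follows; this is a reconstruction rather than the SPDK helper, and `PCI_SYSFS` is an illustrative override introduced only so the sketch can run without real PCI devices:

```shell
# Reconstruction of the driver check traced at setup/acl.sh@32: the
# kernel driver bound to a PCI function is the basename of the
# /sys/bus/pci/devices/<bdf>/driver symlink. PCI_SYSFS is illustrative.
PCI_SYSFS=${PCI_SYSFS:-/sys/bus/pci/devices}

pci_bound_driver() {
    bdf=$1
    link=$PCI_SYSFS/$bdf/driver
    # An unbound device has no driver symlink at all
    if [ -e "$link" ]; then
        basename "$(readlink -f "$link")"
    else
        echo unbound
    fi
}
```

In the trace, the resolved path `/sys/bus/pci/drivers/nvme` confirms the blocked controller was left on the `nvme` driver rather than being rebound to `vfio-pci`.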
00:21:59.179 11:32:42 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:22:04.456 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci
00:22:04.456 11:32:47 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify
00:22:04.456 11:32:47 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver
00:22:04.456 11:32:47 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset
00:22:04.456 11:32:47 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]]
00:22:04.456 11:32:47 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:22:06.989
00:22:06.989 real 0m7.930s
00:22:06.989 user 0m2.146s
00:22:06.989 sys 0m4.365s
00:22:06.989 11:32:50 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # xtrace_disable
00:22:06.989 11:32:50 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x
00:22:06.989 ************************************
00:22:06.989 END TEST allowed
00:22:06.989 ************************************
00:22:07.250
00:22:07.250 real 0m23.908s
00:22:07.250 user 0m7.466s
00:22:07.250 sys 0m14.570s
00:22:07.250 11:32:50 setup.sh.acl -- common/autotest_common.sh@1125 -- # xtrace_disable
00:22:07.250 11:32:50 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:22:07.250 ************************************
00:22:07.250 END TEST acl
00:22:07.250 ************************************
00:22:07.250 11:32:50 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh
00:22:07.250 11:32:50 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:22:07.250 11:32:50 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable
00:22:07.250 11:32:50 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:22:07.250 ************************************
00:22:07.250 START TEST hugepages
00:22:07.250 ************************************
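The hugepages test that starts here opens with `get_meminfo Hugepagesize`, whose trace follows: `/proc/meminfo` is read line by line with `IFS=': '`, so each line splits into a field name, a value, and the trailing `kB` unit. A minimal sketch of that lookup follows; it is a reconstruction of the traced helper, and `MEMINFO` is an illustrative override introduced only so the sketch can run against a canned file:

```shell
# Reconstruction of the get_meminfo lookup traced in this test: split
# each /proc/meminfo line on ': ' and print the value of the requested
# field. MEMINFO is illustrative, not part of the SPDK scripts.
MEMINFO=${MEMINFO:-/proc/meminfo}

get_meminfo() {
    get=$1
    while IFS=': ' read -r var val _; do
        # e.g. "Hugepagesize:    2048 kB" -> var=Hugepagesize val=2048
        if [ "$var" = "$get" ]; then
            echo "$val"
            return 0
        fi
    done < "$MEMINFO"
    return 1
}
```

With the meminfo snapshot printed in the trace (`Hugepagesize: 2048 kB`, `HugePages_Total: 2048`), this lookup yields a 2048 kB default hugepage size, which is what the loop below is scanning for.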
00:22:07.250 11:32:51 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:22:07.250 * Looking for test storage... 00:22:07.250 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 72702724 kB' 'MemAvailable: 76055716 kB' 'Buffers: 10256 kB' 
'Cached: 13542536 kB' 'SwapCached: 0 kB' 'Active: 10581532 kB' 'Inactive: 3469844 kB' 'Active(anon): 10143968 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 501940 kB' 'Mapped: 172464 kB' 'Shmem: 9645384 kB' 'KReclaimable: 198456 kB' 'Slab: 502004 kB' 'SReclaimable: 198456 kB' 'SUnreclaim: 303548 kB' 'KernelStack: 16192 kB' 'PageTables: 8028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438204 kB' 'Committed_AS: 11553912 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 
00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.250 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 
setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 
setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.251 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:22:07.252 11:32:51 
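The trace above is `setup/common.sh`'s `get_meminfo` loop walking `/proc/meminfo` key by key (`IFS=': '` plus `read -r var val _`) until it hits `Hugepagesize`, then echoing the value (2048) and returning. A minimal runnable sketch of that parse, using an inline sample instead of the real `/proc/meminfo` so it runs anywhere (the field names and the `get_field` helper here are illustrative, not the script's actual function names):

```shell
# Sample of /proc/meminfo-style input; the real script reads /proc/meminfo.
meminfo_sample='MemTotal: 92293508 kB
HugePages_Total: 1024
Hugepagesize: 2048 kB'

# Scan line by line; IFS=': ' splits "Key: value unit" into var/val/_,
# mirroring the [[ $var == Hugepagesize ]] / continue pattern in the trace.
get_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done <<< "$meminfo_sample"
    return 1
}

hugepagesize_kb=$(get_field Hugepagesize)
echo "$hugepagesize_kb"   # prints 2048, matching default_hugepages in the trace
```

This explains why the log shows one `continue` per non-matching meminfo key: every line is read and discarded until the requested key appears.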
setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 
00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:22:07.252 11:32:51 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:22:07.252 11:32:51 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:07.252 11:32:51 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:07.252 11:32:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:07.511 ************************************ 00:22:07.511 START TEST default_setup 00:22:07.511 ************************************ 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # default_setup 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- 
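The `clear_hp` portion of the trace (the repeated `echo 0` under `hugepages.sh@39`–`@41`) resets every per-node hugepage pool before the test: for each NUMA node it writes 0 into each `hugepages-*/nr_hugepages` entry. A sketch of that loop against a temporary directory standing in for `/sys/devices/system/node` (writing to the real sysfs paths requires root; the directory layout below mimics sysfs but is an assumption-free simulation):

```shell
# Build a fake sysfs tree: two nodes, two hugepage sizes each, preloaded
# with a nonzero count so the reset is observable.
sysfs=$(mktemp -d)
for node in 0 1; do
    for sz in 2048kB 1048576kB; do
        mkdir -p "$sysfs/node$node/hugepages/hugepages-$sz"
        echo 512 > "$sysfs/node$node/hugepages/hugepages-$sz/nr_hugepages"
    done
done

# The clear_hp pattern from the trace: zero every pool on every node.
for hp in "$sysfs"/node*/hugepages/hugepages-*/nr_hugepages; do
    echo 0 > "$hp"
done

cat "$sysfs/node0/hugepages/hugepages-2048kB/nr_hugepages"   # prints 0
```

With all pools zeroed and `CLEAR_HUGE=yes` exported, the subsequent `default_setup` test starts from a known-clean hugepage state.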
setup/hugepages.sh@52 -- # local node_ids 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:22:07.511 11:32:51 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:10.796 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:10.796 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 
0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:22:10.796 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:22:11.739 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local 
node= 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74817916 kB' 'MemAvailable: 78170836 kB' 'Buffers: 10256 kB' 'Cached: 13542648 kB' 'SwapCached: 0 kB' 'Active: 10600528 kB' 'Inactive: 3469844 kB' 'Active(anon): 10162964 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521352 kB' 'Mapped: 172688 kB' 'Shmem: 9645496 kB' 'KReclaimable: 198312 kB' 'Slab: 500792 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302480 kB' 'KernelStack: 16576 kB' 'PageTables: 8580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11575040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201204 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.739 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.740 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue [... identical non-matching key checks (Committed_AS through HardwareCorrupted) omitted ...] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:11.741 11:32:55
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74817716 kB' 'MemAvailable: 78170636 kB' 'Buffers: 10256 kB' 'Cached: 13542648 kB' 'SwapCached: 0 kB' 'Active: 10600036 kB' 'Inactive: 3469844 kB' 'Active(anon): 10162472 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520728 kB' 'Mapped: 172672 kB' 'Shmem: 9645496 kB' 'KReclaimable: 198312 kB' 'Slab: 500760 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302448 kB' 'KernelStack: 16496 kB' 'PageTables: 8704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11575056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201204 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:11.741 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.741 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue [... identical non-matching key checks (MemFree through HugePages_Rsvd) omitted ...] 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74817804 kB' 'MemAvailable: 78170724 kB' 'Buffers: 10256 kB' 'Cached: 13542652 kB' 'SwapCached: 0 kB' 'Active: 10600724 kB' 'Inactive: 3469844 kB' 'Active(anon): 10163160 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520848 kB' 'Mapped: 172680 kB' 'Shmem: 9645500 kB' 'KReclaimable: 198312 kB' 'Slab: 500828 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302516 kB' 'KernelStack: 16432 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11576196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201268 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 
88080384 kB' 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.743 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue [... identical non-matching key checks (MemFree through KReclaimable) omitted ...] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.745 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:22:11.746 nr_hugepages=1024 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:11.746 resv_hugepages=0 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:11.746 surplus_hugepages=0 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:11.746 anon_hugepages=0 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:22:11.746 
11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74817756 kB' 'MemAvailable: 78170676 kB' 'Buffers: 10256 kB' 'Cached: 13542688 kB' 'SwapCached: 0 kB' 'Active: 10599948 kB' 'Inactive: 3469844 kB' 'Active(anon): 10162384 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520076 kB' 'Mapped: 172680 kB' 'Shmem: 9645536 kB' 'KReclaimable: 198312 kB' 'Slab: 500956 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302644 kB' 'KernelStack: 16400 kB' 'PageTables: 8048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 
kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11576576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201204 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.746 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:11.746 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.746 11:32:55 [xtrace elided: the HugePages_Total scan in setup/common.sh@31-32 continues through the remaining non-matching /proc/meminfo fields (Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree); each iteration hits `continue`]
00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup 
-- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 41677540 kB' 'MemUsed: 6439412 kB' 'SwapCached: 0 kB' 'Active: 3423276 kB' 'Inactive: 189308 kB' 'Active(anon): 3176576 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327152 kB' 'Mapped: 86620 kB' 'AnonPages: 288660 kB' 'Shmem: 2891144 kB' 'KernelStack: 8968 kB' 'PageTables: 4044 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118472 kB' 'Slab: 286488 kB' 'SReclaimable: 118472 kB' 'SUnreclaim: 168016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.748 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 
11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:11.749 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:11.750 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 
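The long runs of `[[ Key == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]` / `continue` lines above are bash xtrace from the `get_meminfo` helper in `setup/common.sh`, which splits each `Key: value` line of `/proc/meminfo` (or the per-node `/sys/devices/system/node/nodeN/meminfo`) and returns the value for the requested key. A minimal standalone sketch of that parsing pattern — not the actual helper, and run here against a sample file rather than the live `/proc/meminfo`:

```shell
#!/usr/bin/env bash
# Minimal sketch (assumed simplification of setup/common.sh) of the
# get_meminfo pattern whose xtrace fills the lines above: split each
# "Key: value" line on ': ' and print the value for the requested key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching key produces one "continue" under xtrace,
        # which is why the log repeats once per meminfo field.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Demo against a small sample instead of the live /proc/meminfo
sample=$(mktemp)
printf '%s\n' 'MemTotal: 48116952 kB' 'HugePages_Total: 1024' \
    'HugePages_Surp: 0' > "$sample"
get_meminfo HugePages_Total "$sample"   # prints 1024
rm -f "$sample"
```

The real helper additionally strips a leading `Node N ` prefix (`mem=("${mem[@]#Node +([0-9]) }")` in the trace) so the same loop works on per-node meminfo files.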
00:22:11.750 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:11.750 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:22:11.750 node0=1024 expecting 1024 00:22:11.750 11:32:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:22:11.750 00:22:11.750 real 0m4.465s 00:22:11.750 user 0m1.135s 00:22:11.750 sys 0m2.113s 00:22:11.750 11:32:55 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:11.750 11:32:55 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:22:11.750 ************************************ 00:22:11.750 END TEST default_setup 00:22:11.750 ************************************ 00:22:12.008 11:32:55 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:22:12.008 11:32:55 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:12.008 11:32:55 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:12.008 11:32:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:12.008 ************************************ 00:22:12.008 START TEST per_node_1G_alloc 00:22:12.008 ************************************ 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # per_node_1G_alloc 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 
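The `END TEST default_setup` / `START TEST per_node_1G_alloc` banners above come from the `run_test` wrapper in `common/autotest_common.sh`. A hedged sketch of that wrapper's shape (names and exact banner formatting are assumptions, not the real implementation):

```shell
#!/usr/bin/env bash
# Assumed sketch of the run_test banner pattern visible in the log:
# print a START banner, run the named test function with its arguments,
# then print a matching END banner and propagate the exit code.
run_test() {
    local name=$1; shift
    echo '************************************'
    echo "START TEST $name"
    echo '************************************'
    "$@"
    local rc=$?
    echo '************************************'
    echo "END TEST $name"
    echo '************************************'
    return $rc
}

# Usage: run_test <banner name> <function> [args...],
# mirroring 'run_test per_node_1G_alloc per_node_1G_alloc' in the log.
demo_test() { echo "running $1"; }
run_test demo demo_test hugepages
```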
00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 
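The `get_test_nr_hugepages 1048576 0 1` trace above converts a 1048576 kB request into `nr_hugepages=512` (1048576 / 2048, the default hugepage size reported as `Hugepagesize: 2048 kB` elsewhere in this log) and assigns 512 pages to each node listed, yielding `NRHUGE=512 HUGENODE=0,1`. A simplified sketch of that arithmetic, with assumed names:

```shell
#!/usr/bin/env bash
# Simplified sketch (assumed names) of the per-node hugepage split:
# convert a size request in kB into a page count for each listed node.
default_hugepages=2048   # kB; matches 'Hugepagesize: 2048 kB' in the log

get_test_nr_hugepages_per_node() {
    local nr=$1; shift
    nodes_test=()
    local node
    for node in "$@"; do
        nodes_test[node]=$nr   # one entry per requested NUMA node
    done
}

size_kb=1048576
nr_hugepages=$(( size_kb / default_hugepages ))   # 512
get_test_nr_hugepages_per_node "$nr_hugepages" 0 1
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # prints node0=512 node1=512
```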
00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:22:12.008 11:32:55 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:14.541 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:14.803 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:22:14.803 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:22:14.803 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # 
verify_nr_hugepages 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.803 11:32:58 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74819032 kB' 'MemAvailable: 78171952 kB' 'Buffers: 10256 kB' 'Cached: 13542776 kB' 'SwapCached: 0 kB' 'Active: 10599212 kB' 'Inactive: 3469844 kB' 'Active(anon): 10161648 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517792 kB' 'Mapped: 172204 kB' 'Shmem: 9645624 kB' 'KReclaimable: 198312 kB' 'Slab: 500996 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302684 kB' 'KernelStack: 16304 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11568920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201108 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:14.803 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue [... identical xtrace iterations elided: the read loop checks and skips each remaining non-matching /proc/meminfo key, MemAvailable through Percpu ...] 00:22:14.805
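The loop traced above is the `get_meminfo` pattern from setup/common.sh: split each meminfo line on `': '` with `read -r var val _`, `continue` past non-matching keys, and print the value once the requested key matches. A condensed, self-contained sketch of that pattern, with sample data copied from the dump above; the fallback for a missing key is an assumption of this sketch, and the real function reads `/proc/meminfo` (or a per-node meminfo file) rather than a here-doc:

```shell
# Sketch of the get_meminfo loop (illustrative; mirrors the setup/common.sh@31-33
# trace above: IFS=': ' / read -r var val _ / continue / echo on match).
get_meminfo() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    # Skip every key until the requested one, then print its value.
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  echo 0   # key absent: report 0 (assumed fallback for this sketch)
}

get_meminfo AnonHugePages <<'EOF'
MemTotal: 92293508 kB
AnonHugePages: 0 kB
HugePages_Surp: 0
EOF
```

Run against the sample lines this prints `0`, the same value the traced lookup assigns to `anon` below.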
11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile 
-t mem 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74819692 kB' 'MemAvailable: 78172612 kB' 'Buffers: 10256 kB' 'Cached: 13542780 kB' 'SwapCached: 0 kB' 'Active: 10597916 kB' 'Inactive: 3469844 kB' 'Active(anon): 10160352 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517500 kB' 'Mapped: 171652 kB' 'Shmem: 9645628 kB' 'KReclaimable: 198312 kB' 'Slab: 500976 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302664 kB' 'KernelStack: 16240 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11563540 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.805 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' [... identical xtrace iterations elided: the read loop checks and skips each non-matching /proc/meminfo key, MemFree through ShmemHugePages, while scanning for HugePages_Surp ...] 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.806 11:32:58 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.806 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:22:14.807 11:32:58 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74819088 kB' 'MemAvailable: 78172008 kB' 'Buffers: 10256 kB' 'Cached: 13542780 kB' 'SwapCached: 0 kB' 'Active: 10597240 kB' 'Inactive: 3469844 kB' 'Active(anon): 10159676 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 
'AnonPages: 517272 kB' 'Mapped: 171572 kB' 'Shmem: 9645628 kB' 'KReclaimable: 198312 kB' 'Slab: 500912 kB' 'SReclaimable: 198312 kB' 'SUnreclaim: 302600 kB' 'KernelStack: 16224 kB' 'PageTables: 7812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11563560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.807 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 
11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.808 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 
11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:22:14.809 nr_hugepages=1024 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:14.809 resv_hugepages=0 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:14.809 surplus_hugepages=0 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:14.809 anon_hugepages=0 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- 
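The trace above is the tail of a `get_meminfo` pass from setup/common.sh: the helper snapshots `/proc/meminfo` (or a per-node sysfs copy when a node is given), then walks it with an `IFS=': ' read -r var val _` loop (the `@31`/`@32`/`@33` markers), echoing the value once the requested field matches and returning 0. A minimal sketch of that loop follows; the structure is inferred from the trace, and the `sed` prefix-strip stands in for the log's `mapfile`/`"${mem[@]#Node +([0-9]) }"` handling of per-node files, so treat it as an approximation rather than the exact common.sh source.

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern visible in the trace
# (field names and file paths match the log; control flow is inferred).
get_meminfo() {
    local get=$1 node=${2:-}        # field to look up, optional NUMA node
    local mem_f=/proc/meminfo
    # Per-node lookups read the sysfs copy when it exists (common.sh@23).
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo

    local var val _
    # Per-node files prefix every line with "Node N "; strip that, then
    # split each "Field:   value kB" line on ": " as the trace does.
    while IFS=': ' read -r var val _; do   # common.sh@31
        [[ $var == "$get" ]] && { echo "$val"; return 0; }  # @32/@33
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1                        # field not present
}
```

With that helper, the surrounding hugepages.sh logic reduces to calls like `surp=$(get_meminfo HugePages_Surp)` and `resv=$(get_meminfo HugePages_Rsvd)`, followed by the sanity check `(( 1024 == nr_hugepages + surp + resv ))` seen at hugepages.sh@107.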
setup/common.sh@19 -- # local var val
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74819088 kB' 'MemAvailable: 78172004 kB' 'Buffers: 10256 kB' 'Cached: 13542832 kB' 'SwapCached: 0 kB' 'Active: 10597268 kB' 'Inactive: 3469844 kB' 'Active(anon): 10159704 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517256 kB' 'Mapped: 171572 kB' 'Shmem: 9645680 kB' 'KReclaimable: 198304 kB' 'Slab: 500904 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302600 kB' 'KernelStack: 16240 kB' 'PageTables: 7852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11563584 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB'
00:22:14.809 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # fields MemTotal through HugePages_Free scanned, no match: continue
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:22:14.811 11:32:58
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:22:14.811 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 42726476 kB' 'MemUsed: 5390476 kB' 'SwapCached: 0 kB' 'Active: 3422372 kB' 'Inactive: 189308 kB' 'Active(anon): 3175672 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327256 kB' 'Mapped: 85812 kB' 'AnonPages: 287556 kB' 'Shmem: 2891248 kB' 'KernelStack: 8904 kB' 'PageTables: 3792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118464 kB' 'Slab: 286552 kB' 'SReclaimable: 118464 kB' 'SUnreclaim: 168088 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:22:15.070 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # fields MemTotal through Slab scanned so far, no match: continue
00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc --
setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.071 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176556 kB' 'MemFree: 32092244 kB' 'MemUsed: 12084312 kB' 'SwapCached: 0 kB' 'Active: 7174932 kB' 'Inactive: 3280536 kB' 'Active(anon): 6984068 kB' 'Inactive(anon): 0 kB' 'Active(file): 190864 kB' 'Inactive(file): 3280536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10225840 kB' 'Mapped: 85760 kB' 'AnonPages: 229756 kB' 'Shmem: 6754440 kB' 'KernelStack: 7336 kB' 'PageTables: 4064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79840 kB' 'Slab: 214352 kB' 'SReclaimable: 79840 kB' 'SUnreclaim: 134512 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:15.072 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.072 11:32:58 
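[editor's note] The trace above shows `get_meminfo` selecting the per-node sysfs meminfo file, stripping the `Node <id> ` prefix with an extglob expansion, and scanning `var: val` pairs with `IFS=': ' read`. A minimal standalone sketch of that parse, assuming bash; the file path is passed as an argument here purely for illustration, whereas the real helper chooses between `/proc/meminfo` and `/sys/devices/system/node/node$node/meminfo` itself:

```shell
#!/usr/bin/env bash
shopt -s extglob   # needed for the "Node +([0-9]) " prefix pattern

# Sketch of the parse loop driven in the trace: strip any "Node <id> "
# prefix, split each line on ": ", and print the value of one field.
get_meminfo() {
  local get=$1 mem_f=$2 var val _ line
  local -a mem
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<< "$line"
    if [[ $var == "$get" ]]; then
      echo "$val"
      return 0
    fi
  done
  return 1
}
```

Against the node1 data printed above, `get_meminfo HugePages_Surp <file>` would walk every field and emit `0`, which is exactly the long `continue` chain the xtrace records.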
[xtrace elided: the same setup/common.sh@31-32 field loop over the node1 meminfo output just printed, skipping every field with `continue` until HugePages_Surp]
00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:22:15.073 node0=512 expecting 512 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 --
# for node in "${!nodes_test[@]}" 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:22:15.073 node1=512 expecting 512 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:22:15.073 00:22:15.073 real 0m3.065s 00:22:15.073 user 0m1.085s 00:22:15.073 sys 0m1.974s 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:15.073 11:32:58 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:22:15.073 ************************************ 00:22:15.073 END TEST per_node_1G_alloc 00:22:15.073 ************************************ 00:22:15.073 11:32:58 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:22:15.073 11:32:58 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:15.073 11:32:58 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:15.073 11:32:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:15.073 ************************************ 00:22:15.073 START TEST even_2G_alloc 00:22:15.073 ************************************ 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # even_2G_alloc 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 
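[editor's note] The trace above walks `get_test_nr_hugepages 2097152` into the per-node split: the requested size in kB divided by the default hugepage size gives `nr_hugepages=1024`, and with two nodes and no user node list each node ends up with 512 (the `nodes_test[_no_nodes - 1]=512` assignments). A hedged sketch of that arithmetic; the function name mirrors the trace, and `default_hugepages=2048` kB (2 MiB pages) is an assumption consistent with the values shown:

```shell
#!/usr/bin/env bash

# Sketch of the split traced above: size (kB) / default hugepage size
# gives the total count, spread evenly across the NUMA nodes.  The real
# script also folds in surplus/reserved pages per node; that adjustment
# is zero in this run and omitted here.
get_test_nr_hugepages_per_node() {
  local size=$1 _no_nodes=$2 node
  local default_hugepages=2048              # kB per page -- assumption
  local _nr_hugepages=$((size / default_hugepages))
  local -a nodes_test
  for ((node = 0; node < _no_nodes; node++)); do
    nodes_test[node]=$((_nr_hugepages / _no_nodes))
  done
  # any remainder lands on node 0 in this sketch
  nodes_test[0]=$((nodes_test[0] + _nr_hugepages % _no_nodes))
  for ((node = 0; node < _no_nodes; node++)); do
    printf 'node%d=%d expecting %d\n' "$node" "${nodes_test[node]}" "${nodes_test[node]}"
  done
}
```

For `size=2097152` and two nodes this reproduces the `node0=512 expecting 512` / `node1=512 expecting 512` lines the test prints.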
00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:22:15.073 11:32:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:18.362 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:18.362 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:22:18.362 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:22:18.362 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 
00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.362 11:33:01 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74744800 kB' 'MemAvailable: 78097716 kB' 'Buffers: 10256 kB' 'Cached: 13542928 kB' 'SwapCached: 0 kB' 'Active: 10604524 kB' 'Inactive: 3469844 kB' 'Active(anon): 10166960 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523964 kB' 'Mapped: 172584 kB' 'Shmem: 9645776 kB' 'KReclaimable: 198304 kB' 'Slab: 501272 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302968 kB' 'KernelStack: 16352 kB' 'PageTables: 8272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11573220 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201112 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.362 11:33:01 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.362 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:01 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74747428 kB' 'MemAvailable: 78100344 kB' 'Buffers: 10256 kB' 'Cached: 13542932 kB' 'SwapCached: 0 kB' 'Active: 10604760 kB' 'Inactive: 3469844 kB' 'Active(anon): 10167196 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 
8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524252 kB' 'Mapped: 172580 kB' 'Shmem: 9645780 kB' 'KReclaimable: 198304 kB' 'Slab: 501272 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302968 kB' 'KernelStack: 16368 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11573240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.363 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74747444 kB' 'MemAvailable: 78100360 kB' 'Buffers: 10256 kB' 'Cached: 13542948 kB' 'SwapCached: 0 kB' 'Active: 10603828 kB' 'Inactive: 3469844 kB' 
'Active(anon): 10166264 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523664 kB' 'Mapped: 172500 kB' 'Shmem: 9645796 kB' 'KReclaimable: 198304 kB' 'Slab: 501224 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302920 kB' 'KernelStack: 16320 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11573260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.364 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 
11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:18.365 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:22:18.365 nr_hugepages=1024 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:18.365 resv_hugepages=0 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:18.365 surplus_hugepages=0 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:18.365 anon_hugepages=0 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:18.365 
11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74747444 kB' 'MemAvailable: 78100360 kB' 'Buffers: 10256 kB' 'Cached: 13542968 kB' 'SwapCached: 0 kB' 'Active: 10603856 kB' 'Inactive: 3469844 kB' 'Active(anon): 10166292 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523656 kB' 'Mapped: 172500 kB' 'Shmem: 9645816 kB' 'KReclaimable: 198304 kB' 'Slab: 501224 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302920 kB' 'KernelStack: 16320 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11573280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.365 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 
11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 
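The trace above repeatedly exercises one field-matching loop until `HugePages_Total` matches and `echo 1024` fires. A minimal self-contained sketch of that parsing pattern, reconstructed from the trace (the helper name `get_meminfo` and the sample input lines are illustrative, not the verbatim setup/common.sh source):

```shell
#!/usr/bin/env bash
# Sketch of the loop seen in the trace: split each meminfo-style line on
# ': ' (the IFS set at common.sh@31), skip non-matching fields via
# 'continue' (common.sh@32), and echo the value of the requested field
# (common.sh@33). Real runs read /proc/meminfo or a per-node meminfo.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # The escaped \H\u\g\e... comparisons in the trace are just
        # literal string matches; a quoted comparison does the same.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Hypothetical sample input standing in for /proc/meminfo:
printf '%s\n' 'MemTotal: 92293508 kB' \
              'HugePages_Total: 1024' \
              'HugePages_Rsvd: 0' | get_meminfo HugePages_Total
```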
00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 
00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 42672716 kB' 'MemUsed: 5444236 kB' 'SwapCached: 0 kB' 'Active: 3428732 kB' 'Inactive: 189308 kB' 'Active(anon): 3182032 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327380 kB' 'Mapped: 86576 kB' 'AnonPages: 293816 kB' 'Shmem: 2891372 kB' 'KernelStack: 8984 kB' 'PageTables: 4016 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118464 kB' 'Slab: 286556 kB' 'SReclaimable: 118464 kB' 'SUnreclaim: 168092 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.366 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:22:18.367 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176556 kB' 'MemFree: 32074728 kB' 'MemUsed: 12101828 kB' 'SwapCached: 0 kB' 'Active: 7175124 kB' 'Inactive: 3280536 kB' 'Active(anon): 6984260 kB' 'Inactive(anon): 0 kB' 'Active(file): 190864 kB' 'Inactive(file): 3280536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10225868 kB' 'Mapped: 85924 kB' 'AnonPages: 229840 kB' 'Shmem: 6754468 kB' 'KernelStack: 7336 kB' 'PageTables: 4108 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79840 kB' 'Slab: 214668 kB' 'SReclaimable: 79840 kB' 'SUnreclaim: 134828 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.367 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:22:18.368 node0=512 expecting 512 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:22:18.368 node1=512 expecting 512 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:22:18.368 00:22:18.368 real 0m3.307s 00:22:18.368 user 0m1.216s 00:22:18.368 sys 0m2.130s 00:22:18.368 11:33:02 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:18.368 11:33:02 
setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:22:18.368 ************************************ 00:22:18.368 END TEST even_2G_alloc 00:22:18.368 ************************************ 00:22:18.368 11:33:02 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:22:18.368 11:33:02 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:18.368 11:33:02 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:18.368 11:33:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:18.368 ************************************ 00:22:18.368 START TEST odd_alloc 00:22:18.368 ************************************ 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # odd_alloc 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@67 -- # local -g nodes_test 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:22:18.368 11:33:02 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:21.651 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:21.911 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:22:21.911 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:00:04.5 (8086 2021): Already using 
the vfio-pci driver 00:22:21.911 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:22:21.911 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:21.911 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74730284 kB' 'MemAvailable: 78083200 kB' 'Buffers: 10256 kB' 'Cached: 13543076 kB' 'SwapCached: 0 kB' 'Active: 10596096 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158532 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515804 kB' 'Mapped: 171608 kB' 'Shmem: 9645924 kB' 'KReclaimable: 198304 kB' 'Slab: 500792 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302488 kB' 'KernelStack: 16256 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11566264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201076 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.911 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74731140 kB' 'MemAvailable: 78084056 kB' 'Buffers: 
10256 kB' 'Cached: 13543092 kB' 'SwapCached: 0 kB' 'Active: 10595880 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158316 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515608 kB' 'Mapped: 171604 kB' 'Shmem: 9645940 kB' 'KReclaimable: 198304 kB' 'Slab: 500828 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302524 kB' 'KernelStack: 16240 kB' 'PageTables: 7900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11566280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.912 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:21.913 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.177 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:22:22.178 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74731568 kB' 'MemAvailable: 78084484 kB' 'Buffers: 10256 kB' 'Cached: 13543096 kB' 'SwapCached: 0 kB' 'Active: 10596208 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158644 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515924 kB' 'Mapped: 171604 kB' 'Shmem: 9645944 kB' 'KReclaimable: 198304 kB' 'Slab: 500828 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302524 kB' 'KernelStack: 16224 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 
kB' 'Committed_AS: 11566304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201060 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.178 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:22:22.179 nr_hugepages=1025 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:22.179 resv_hugepages=0 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:22.179 surplus_hugepages=0 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:22.179 anon_hugepages=0 00:22:22.179 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@19 -- # local var val 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74731568 kB' 'MemAvailable: 78084484 kB' 'Buffers: 10256 kB' 'Cached: 13543116 kB' 'SwapCached: 0 kB' 'Active: 10596300 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158736 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516020 kB' 'Mapped: 171604 kB' 'Shmem: 9645964 kB' 'KReclaimable: 198304 kB' 'Slab: 500828 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302524 kB' 'KernelStack: 16256 kB' 'PageTables: 7948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11566324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201076 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.180 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.180 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@27 -- # local node 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:22.181 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:22.182 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 42685292 kB' 'MemUsed: 5431660 kB' 'SwapCached: 0 kB' 'Active: 3421692 kB' 'Inactive: 189308 kB' 'Active(anon): 3174992 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327440 kB' 'Mapped: 85812 kB' 'AnonPages: 286752 kB' 'Shmem: 2891432 kB' 'KernelStack: 8952 kB' 'PageTables: 3880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118464 kB' 'Slab: 286408 kB' 'SReclaimable: 118464 kB' 'SUnreclaim: 167944 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.182 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:22.183 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176556 kB' 'MemFree: 32046596 kB' 'MemUsed: 12129960 kB' 'SwapCached: 0 kB' 'Active: 7174900 kB' 'Inactive: 3280536 kB' 'Active(anon): 6984036 kB' 'Inactive(anon): 0 kB' 'Active(file): 190864 kB' 'Inactive(file): 3280536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10225972 kB' 'Mapped: 85792 kB' 'AnonPages: 229588 kB' 'Shmem: 6754572 kB' 'KernelStack: 7304 kB' 'PageTables: 4072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79840 kB' 'Slab: 214420 kB' 'SReclaimable: 79840 
kB' 'SUnreclaim: 134580 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.183 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 
11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:22:22.184 node0=512 expecting 513 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:22:22.184 node1=513 expecting 512 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:22:22.184 00:22:22.184 real 0m3.736s 00:22:22.184 user 0m1.370s 00:22:22.184 sys 0m2.443s 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:22.184 11:33:05 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:22:22.184 ************************************ 00:22:22.184 END TEST odd_alloc 00:22:22.184 ************************************ 00:22:22.184 11:33:06 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:22:22.184 11:33:06 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:22.184 11:33:06 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:22.184 11:33:06 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:22.184 ************************************ 00:22:22.184 START TEST custom_alloc 00:22:22.184 ************************************ 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # custom_alloc 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:22:22.184 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:22:22.185 11:33:06 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:22:22.185 11:33:06 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:22:22.185 11:33:06 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:22:22.185 11:33:06 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:25.473 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:25.473 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:22:25.473 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.1 (8086 
2021): Already using the vfio-pci driver 00:22:25.473 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:25.473 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 73723016 kB' 'MemAvailable: 77075932 kB' 'Buffers: 10256 kB' 'Cached: 13543232 kB' 'SwapCached: 0 kB' 'Active: 10596224 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158660 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515792 kB' 'Mapped: 171624 kB' 'Shmem: 9646080 kB' 'KReclaimable: 198304 kB' 'Slab: 500452 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302148 kB' 'KernelStack: 16224 kB' 'PageTables: 7900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11564768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.473 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.473 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.474 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 
11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 73723344 kB' 'MemAvailable: 77076260 kB' 'Buffers: 10256 kB' 'Cached: 13543236 kB' 'SwapCached: 0 kB' 'Active: 10595564 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158000 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515116 kB' 'Mapped: 171616 kB' 'Shmem: 9646084 kB' 'KReclaimable: 198304 kB' 'Slab: 500536 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302232 kB' 'KernelStack: 16176 kB' 'PageTables: 7736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11564916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.739 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.740 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 73723640 kB' 'MemAvailable: 77076556 kB' 'Buffers: 10256 kB' 'Cached: 13543252 kB' 'SwapCached: 0 kB' 'Active: 10595576 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158012 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515116 kB' 'Mapped: 171616 kB' 'Shmem: 9646100 kB' 'KReclaimable: 198304 kB' 'Slab: 500536 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302232 kB' 'KernelStack: 16176 kB' 'PageTables: 7736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11564944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.741 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.742 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:22:25.743 11:33:09 
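The trace above is `setup/common.sh`'s `get_meminfo` walking `/proc/meminfo` one field at a time: `IFS=': '` plus `read -r var val _` splits each `Key: Value kB` line, non-matching keys hit `continue`, and the matching key's value is echoed (here `HugePages_Surp` and `HugePages_Rsvd`, both 0). A minimal sketch of that loop, simplified from what the log shows (the real helper also uses `mapfile` and strips per-node `Node N` prefixes; `get_meminfo_sketch` and the sample file are illustrative names, not the original script):

```shell
# Sketch of the lookup the trace performs for each requested key.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching field is skipped -- the "continue" lines above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Exercise it against a small sample instead of the live /proc/meminfo:
printf '%s\n' 'HugePages_Total: 1536' 'HugePages_Surp: 0' > /tmp/meminfo.sample
get_meminfo_sketch HugePages_Surp /tmp/meminfo.sample   # prints 0
```

Scanning the whole file per key is O(fields) per lookup, which is why the log repeats the match/continue pattern once per `/proc/meminfo` field for each of `surp` and `resv`.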
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:22:25.743 nr_hugepages=1536 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:25.743 resv_hugepages=0 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:25.743 surplus_hugepages=0 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:25.743 anon_hugepages=0 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
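After the lookups, `hugepages.sh@107-109` checks that the configured pool matches what the kernel reports: the requested count (1536) must equal `HugePages_Total` plus surplus plus reserved, and must equal `HugePages_Total` outright. A hedged sketch of that accounting, with an assumed helper name (`check_hugepages_sketch` is not in the original script):

```shell
# Sketch of the hugepages.sh accounting check seen in the trace:
# want == total + surp + resv, and want == total.
check_hugepages_sketch() {
    local want=$1 total=$2 surp=$3 resv=$4
    (( want == total + surp + resv )) || return 1
    (( want == total )) || return 1
    return 0
}

check_hugepages_sketch 1536 1536 0 0 && echo OK   # prints OK
```

With `surp=0` and `resv=0` from the log, both conditions reduce to `1536 == 1536`, so the custom_alloc test proceeds with `nr_hugepages=1536`.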
00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 73724332 kB' 'MemAvailable: 77077248 kB' 'Buffers: 10256 kB' 'Cached: 13543260 kB' 'SwapCached: 0 kB' 'Active: 10595668 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158104 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515216 kB' 'Mapped: 171616 kB' 'Shmem: 9646108 kB' 'KReclaimable: 198304 kB' 'Slab: 500536 kB' 'SReclaimable: 198304 kB' 'SUnreclaim: 302232 kB' 'KernelStack: 16192 kB' 'PageTables: 7828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11565332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201012 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.743 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.744 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:22:25.745 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:25.746 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 42708940 kB' 'MemUsed: 5408012 kB' 'SwapCached: 0 kB' 'Active: 3421992 kB' 'Inactive: 189308 kB' 'Active(anon): 3175292 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327456 kB' 'Mapped: 85812 kB' 'AnonPages: 287008 kB' 'Shmem: 2891448 kB' 'KernelStack: 8936 kB' 'PageTables: 3880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118464 kB' 'Slab: 286176 kB' 'SReclaimable: 118464 kB' 'SUnreclaim: 167712 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 
11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.746 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 
11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:22:25.747 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176556 kB' 'MemFree: 31015424 kB' 'MemUsed: 13161132 kB' 'SwapCached: 0 kB' 'Active: 7174128 kB' 'Inactive: 3280536 kB' 'Active(anon): 6983264 kB' 'Inactive(anon): 0 kB' 'Active(file): 190864 kB' 'Inactive(file): 
3280536 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10226124 kB' 'Mapped: 85804 kB' 'AnonPages: 228584 kB' 'Shmem: 6754724 kB' 'KernelStack: 7272 kB' 'PageTables: 3972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79840 kB' 'Slab: 214360 kB' 'SReclaimable: 79840 kB' 'SUnreclaim: 134520 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:22:25.747 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.748 11:33:09 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.748 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:22:25.749 node0=512 expecting 512 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:22:25.749 node1=1024 expecting 1024 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:22:25.749 00:22:25.749 real 0m3.483s 00:22:25.749 user 0m1.234s 00:22:25.749 sys 0m2.302s 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:25.749 11:33:09 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:22:25.749 ************************************ 00:22:25.749 END TEST custom_alloc 00:22:25.749 ************************************ 00:22:25.749 11:33:09 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:22:25.749 11:33:09 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:25.749 11:33:09 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:25.749 11:33:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:25.749 ************************************ 00:22:25.749 START TEST no_shrink_alloc 00:22:25.749 ************************************ 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- 
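The `custom_alloc` test that ends above spends most of its trace inside `get_meminfo` from setup/common.sh: it reads a meminfo file one line at a time with `IFS=': ' read -r var val _`, skipping (`continue`) every field until the requested one matches, then echoes its value. The sketch below is a hedged, self-contained reconstruction of that pattern, not the real SPDK helper: the real function takes an optional NUMA node number and strips the `Node <n> ` prefix from per-node files via `mapfile`, whereas this sketch substitutes an optional file argument (an assumption made here for testability) and defaults to the system-wide `/proc/meminfo`.

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo lookup pattern visible in the trace above.
# Usage: get_meminfo <field> [meminfo-file]
# NOTE: the file argument is a simplification assumed here; the real
# setup/common.sh helper takes a node number instead and selects
# /sys/devices/system/node/node<n>/meminfo when it exists.
get_meminfo() {
	local get=$1
	local mem_f=${2:-/proc/meminfo}
	local var val _
	# Split each "Field:   value kB" line on ':' and whitespace,
	# discarding the unit column into the throwaway variable "_".
	while IFS=': ' read -r var val _; do
		if [[ $var == "$get" ]]; then
			echo "${val:-0}"
			return 0
		fi
	done < "$mem_f"
	# Mirror the trace's behavior of echoing 0 for a missing field.
	echo 0
}
```

In the trace, the same loop body is replayed once per meminfo field, which is why a single `get_meminfo HugePages_Surp 1` call expands into dozens of `[[ Field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]` / `continue` lines under `set -x`.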
common/autotest_common.sh@1124 -- # no_shrink_alloc 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:22:25.749 
11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:22:25.749 11:33:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:29.096 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:29.096 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:22:29.096 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:22:29.096 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:22:29.372 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 
'MemFree: 74810872 kB' 'MemAvailable: 78163784 kB' 'Buffers: 10256 kB' 'Cached: 13543380 kB' 'SwapCached: 0 kB' 'Active: 10598100 kB' 'Inactive: 3469844 kB' 'Active(anon): 10160536 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517072 kB' 'Mapped: 172152 kB' 'Shmem: 9646228 kB' 'KReclaimable: 198296 kB' 'Slab: 500348 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302052 kB' 'KernelStack: 16336 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11568676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201140 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.372 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.372 11:33:13 
00:22:29.372-373 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 [trace condensed: the `IFS=': '` / `read -r var val _` / `[[ $var == AnonHugePages ]]` / `continue` cycle repeats identically for every remaining /proc/meminfo field from MemAvailable through HardwareCorrupted, none matching AnonHugePages]
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74810052 kB' 'MemAvailable: 78162964 kB' 'Buffers: 10256 kB' 'Cached: 13543384 kB' 'SwapCached: 0 kB' 
'Active: 10598152 kB' 'Inactive: 3469844 kB' 'Active(anon): 10160588 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517088 kB' 'Mapped: 171740 kB' 'Shmem: 9646232 kB' 'KReclaimable: 198296 kB' 'Slab: 500348 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302052 kB' 'KernelStack: 16256 kB' 'PageTables: 7956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11568692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201156 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.374 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.374 11:33:13 
00:22:29.374-375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 [trace condensed: the same per-field read/compare loop repeats for the HugePages_Surp lookup, each field from MemAvailable onward failing the match and hitting `continue`; trace continues past the end of this excerpt]
var val _ 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.375 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:29.376 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74807360 kB' 'MemAvailable: 78160272 kB' 'Buffers: 10256 kB' 'Cached: 13543404 kB' 'SwapCached: 0 kB' 'Active: 10596552 kB' 'Inactive: 3469844 kB' 'Active(anon): 10158988 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515912 kB' 'Mapped: 171648 kB' 'Shmem: 9646252 kB' 'KReclaimable: 198296 kB' 'Slab: 500332 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302036 kB' 'KernelStack: 16160 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11568716 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201140 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.376 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.377 
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.377 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:22:29.378 nr_hugepages=1024 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:29.378 resv_hugepages=0 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:29.378 surplus_hugepages=0 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:29.378 anon_hugepages=0 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:29.378 
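The trace above is `setup/common.sh`'s `get_meminfo` helper walking every `/proc/meminfo` line, comparing each key against the requested one (`HugePages_Rsvd`, then `HugePages_Total`), and echoing the matching value — which is why each lookup produces one trace line per meminfo field. A minimal Python sketch of that same field-extraction logic (the function name and the `Node <n>` prefix handling mirror what the log shows; the sample text is hypothetical):

```python
# Sketch of the lookup setup/common.sh performs: read meminfo-style
# "Key: value [kB]" lines, strip an optional "Node <n> " prefix
# (per-node meminfo files carry one), and return the value for one
# requested key -- the same job as the IFS=': ' read loop in the trace.
import re

def get_meminfo(text, key):
    for line in text.splitlines():
        # Per-node meminfo files prefix every line with "Node <n> ".
        line = re.sub(r'^Node \d+ ', '', line)
        var, _, rest = line.partition(':')
        if var.strip() == key:
            # First whitespace-separated token after the colon.
            return int(rest.split()[0])
    return 0  # key absent, matching the trace's "echo 0" fallback

sample = (
    "MemTotal:       92293508 kB\n"
    "HugePages_Total:    1024\n"
    "HugePages_Free:     1024\n"
    "HugePages_Rsvd:        0\n"
    "HugePages_Surp:        0\n"
)
print(get_meminfo(sample, "HugePages_Total"))  # -> 1024
print(get_meminfo(sample, "HugePages_Surp"))   # -> 0
```

The bash original avoids forking by doing the split with `IFS=': '` and `read -r var val _` inside the shell, which is why every non-matching key still produces a `continue` trace line.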
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74805912 kB' 'MemAvailable: 78158824 kB' 'Buffers: 10256 kB' 'Cached: 13543424 kB' 'SwapCached: 0 kB' 'Active: 10597128 kB' 'Inactive: 3469844 kB' 'Active(anon): 10159564 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516496 kB' 'Mapped: 171648 kB' 'Shmem: 9646272 kB' 'KReclaimable: 198296 kB' 'Slab: 
500420 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302124 kB' 'KernelStack: 16304 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11567244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201156 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.378 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.379 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:29.380 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 
11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 41689252 kB' 'MemUsed: 6427700 kB' 'SwapCached: 0 kB' 'Active: 3422348 kB' 'Inactive: 189308 kB' 'Active(anon): 3175648 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327472 kB' 'Mapped: 85832 kB' 'AnonPages: 287320 kB' 'Shmem: 2891464 kB' 'KernelStack: 8920 kB' 'PageTables: 3788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118456 kB' 'Slab: 286196 kB' 'SReclaimable: 118456 kB' 'SUnreclaim: 167740 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.380 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.380 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.381 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.382 11:33:13 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:22:29.382 node0=1024 expecting 1024 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:22:29.382 11:33:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:22:33.578 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:22:33.578 
0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:22:33.578 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:22:33.578 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:22:33.578 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # 
local anon 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74818684 kB' 'MemAvailable: 78171596 kB' 'Buffers: 10256 kB' 'Cached: 13543512 kB' 'SwapCached: 0 kB' 'Active: 10596896 kB' 'Inactive: 3469844 kB' 'Active(anon): 10159332 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516240 kB' 'Mapped: 171648 kB' 'Shmem: 9646360 
kB' 'KReclaimable: 198296 kB' 'Slab: 500708 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302412 kB' 'KernelStack: 16176 kB' 'PageTables: 7824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11566576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.578 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.579 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 
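The `INFO: Requested 512 hugepages but 1024 already allocated on node0` line earlier in this pass is the behavior the `no_shrink_alloc` test exercises: with `CLEAR_HUGE=no` and `NRHUGE=512`, an existing larger allocation is left in place rather than shrunk. A sketch of that policy under stated assumptions (plain variables stand in for the real `nr_hugepages` sysfs write, so it runs without root):

```shell
# "No shrink" allocation policy: only grow the hugepage pool, never shrink it.
# In the real setup.sh the write would go to
# /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages.
allocated=1024
requested=512
if (( requested > allocated )); then
    allocated=$requested
else
    echo "INFO: Requested $requested hugepages but $allocated already allocated on node0"
fi
```

This is why the subsequent `verify_nr_hugepages` pass still reads `HugePages_Total: 1024` from meminfo and reports `node0=1024 expecting 1024`.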
00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74818756 kB' 'MemAvailable: 78171668 kB' 'Buffers: 10256 kB' 'Cached: 13543512 kB' 'SwapCached: 0 kB' 'Active: 10597408 kB' 'Inactive: 3469844 kB' 'Active(anon): 10159844 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516856 kB' 'Mapped: 171632 kB' 'Shmem: 9646360 kB' 'KReclaimable: 198296 kB' 'Slab: 500788 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302492 kB' 'KernelStack: 16192 kB' 'PageTables: 7904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11567720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 
'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.580 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.581 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74819824 kB' 
'MemAvailable: 78172736 kB' 'Buffers: 10256 kB' 'Cached: 13543532 kB' 'SwapCached: 0 kB' 'Active: 10597280 kB' 'Inactive: 3469844 kB' 'Active(anon): 10159716 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516168 kB' 'Mapped: 171652 kB' 'Shmem: 9646380 kB' 'KReclaimable: 198296 kB' 'Slab: 500788 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302492 kB' 'KernelStack: 16144 kB' 'PageTables: 7756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11569232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201012 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 
11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.582 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[ trace condensed: the same IFS=': ' / read -r var val _ / field-compare / continue cycle repeats for every remaining /proc/meminfo field (Zswapped, Dirty, Writeback, AnonPages, ... through CmaTotal, CmaFree, Unaccepted, HugePages_Total) until HugePages_Rsvd is reached ]
00:22:33.584 11:33:16
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:22:33.584 nr_hugepages=1024 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:22:33.584 resv_hugepages=0 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:22:33.584 surplus_hugepages=0 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:22:33.584 anon_hugepages=0 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:22:33.584 11:33:16 
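The repetitive trace above is produced by a `get_meminfo`-style helper in setup/common.sh that walks /proc/meminfo one "key: value" record at a time with `IFS=': ' read -r`, echoing the value on a match and 0 when the key is absent (hence `resv=0` from `HugePages_Rsvd: 0`). A minimal re-creation of that pattern — a sketch, not the actual SPDK helper; the optional file argument is an assumption added here for illustration:

```shell
#!/usr/bin/env bash
# Sketch of the field scan seen in the trace: split each meminfo line on
# ": " into (var, val, rest), return val when var matches the requested
# key, and echo 0 when the key is never found.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    echo 0
}
```

The linear scan explains the long `continue` runs: every field before the requested one produces a compare/continue pair in the xtrace output.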
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74820632 kB' 'MemAvailable: 78173544 kB' 'Buffers: 10256 kB' 'Cached: 13543556 kB' 'SwapCached: 0 kB' 'Active: 10597660 kB' 'Inactive: 3469844 kB' 'Active(anon): 10160096 kB' 'Inactive(anon): 0 kB' 'Active(file): 437564 kB' 'Inactive(file): 3469844 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516988 kB' 'Mapped: 171652 kB' 'Shmem: 9646404 kB' 'KReclaimable: 198296 kB' 'Slab: 500788 kB' 'SReclaimable: 198296 kB' 'SUnreclaim: 302492 kB' 'KernelStack: 16272 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11569256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 52800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 
kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 667056 kB' 'DirectMap2M: 12640256 kB' 'DirectMap1G: 88080384 kB' 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:33.584 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[ trace condensed: the same IFS=': ' / read -r var val _ / field-compare / continue cycle repeats for every /proc/meminfo field from MemFree through CmaTotal ]
00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc
-- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # 
(( no_nodes > 0 )) 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116952 kB' 'MemFree: 41684304 kB' 'MemUsed: 6432648 kB' 'SwapCached: 0 kB' 'Active: 3423944 kB' 'Inactive: 189308 kB' 'Active(anon): 3177244 kB' 'Inactive(anon): 0 kB' 'Active(file): 246700 kB' 'Inactive(file): 189308 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3327488 kB' 'Mapped: 85832 
kB' 'AnonPages: 288984 kB' 'Shmem: 2891480 kB' 'KernelStack: 9160 kB' 'PageTables: 4264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 118456 kB' 'Slab: 286620 kB' 'SReclaimable: 118456 kB' 'SUnreclaim: 168164 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.586 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:16 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:16 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@33 -- # return 0 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:22:33.587 node0=1024 expecting 1024 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:22:33.587 00:22:33.587 real 0m7.369s 00:22:33.587 user 0m2.757s 00:22:33.587 sys 0m4.770s 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:33.587 11:33:17 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:22:33.587 ************************************ 00:22:33.587 END TEST no_shrink_alloc 00:22:33.587 ************************************ 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:33.587 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for 
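The long `[[ Key == \H\u\g\e... ]] / continue` runs traced above all come from one helper, `get_meminfo` in setup/common.sh: it reads a meminfo file line by line with `IFS=': '` and echoes the value once the requested key matches. A minimal sketch of that pattern follows; the file argument is an addition made here purely so the sketch can be run against sample data (the traced helper selects `/proc/meminfo` or a per-node sysfs `meminfo` itself), so treat it as an illustration, not the exact SPDK code.

```shell
# Hedged sketch of the get_meminfo pattern seen in the trace: split each
# "Key:   value unit" line on ':' and spaces, print the value for one key.
get_meminfo() {
  local get=$1 mem_f=${2:-/proc/meminfo} var val _
  while IFS=': ' read -r var val _; do
    # val holds the number; any trailing unit such as "kB" lands in _
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done < "$mem_f"
  return 1
}
```

Against the values printed in this trace, `get_meminfo HugePages_Total` yields 1024, matching the `echo 1024` / `return 0` pair recorded above.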
node in "${!nodes_sys[@]}" 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:22:33.588 11:33:17 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:22:33.588 00:22:33.588 real 0m26.044s 00:22:33.588 user 0m9.054s 00:22:33.588 sys 0m16.146s 00:22:33.588 11:33:17 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:33.588 11:33:17 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:22:33.588 ************************************ 00:22:33.588 END TEST hugepages 00:22:33.588 ************************************ 00:22:33.588 11:33:17 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:22:33.588 11:33:17 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:33.588 11:33:17 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:33.588 11:33:17 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:22:33.588 ************************************ 00:22:33.588 START TEST driver 00:22:33.588 ************************************ 00:22:33.588 11:33:17 setup.sh.driver -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:22:33.588 * Looking for test storage... 
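The `clear_hp` trace above walks every NUMA node's `hugepages/hugepages-*` directories and echoes 0 into each `nr_hugepages`. A sketch of that loop follows; the sysfs-root parameter is an assumption added here only so the loop can be exercised against a scratch directory without root privileges, and the plain `node*[0-9]` glob stands in for the extglob `node+([0-9])` the traced script uses.

```shell
# Hedged sketch of the clear_hp pattern seen in the trace: reset every
# per-node hugepage count by writing 0 to each nr_hugepages attribute.
clear_hp() {
  local root=${1:-/sys} node hp
  for node in "$root"/devices/system/node/node*[0-9]; do
    for hp in "$node"/hugepages/hugepages-*; do
      # skip unmatched globs; only write where the attribute exists
      [ -e "$hp/nr_hugepages" ] && echo 0 > "$hp/nr_hugepages"
    done
  done
  return 0
}
```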
00:22:33.588 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:22:33.588 11:33:17 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:22:33.588 11:33:17 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:22:33.588 11:33:17 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:22:38.858 11:33:22 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:22:38.858 11:33:22 setup.sh.driver -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:38.858 11:33:22 setup.sh.driver -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:38.858 11:33:22 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:22:38.858 ************************************ 00:22:38.858 START TEST guess_driver 00:22:38.858 ************************************ 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # guess_driver 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 215 > 0 )) 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:22:38.858 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:22:38.858 Looking for driver=vfio-pci 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:22:38.858 11:33:22 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == 
vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker 
setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.149 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:42.150 11:33:25 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:43.086 11:33:26 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:22:43.086 11:33:26 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:22:43.086 11:33:26 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:22:43.086 11:33:27 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 
00:22:43.086 11:33:27 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:22:43.086 11:33:27 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:22:43.087 11:33:27 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:22:48.356 00:22:48.356 real 0m9.535s 00:22:48.356 user 0m2.686s 00:22:48.357 sys 0m5.008s 00:22:48.357 11:33:31 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:48.357 11:33:31 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:22:48.357 ************************************ 00:22:48.357 END TEST guess_driver 00:22:48.357 ************************************ 00:22:48.357 00:22:48.357 real 0m14.527s 00:22:48.357 user 0m4.151s 00:22:48.357 sys 0m7.795s 00:22:48.357 11:33:31 setup.sh.driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:48.357 11:33:31 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:22:48.357 ************************************ 00:22:48.357 END TEST driver 00:22:48.357 ************************************ 00:22:48.357 11:33:31 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:22:48.357 11:33:31 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:48.357 11:33:31 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:48.357 11:33:31 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:22:48.357 ************************************ 00:22:48.357 START TEST devices 00:22:48.357 ************************************ 00:22:48.357 11:33:31 setup.sh.devices -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:22:48.357 * Looking for test storage... 
00:22:48.357 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:22:48.357 11:33:31 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:22:48.357 11:33:31 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:22:48.357 11:33:31 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:22:48.357 11:33:31 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1669 -- # local nvme bdf 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:22:51.644 11:33:35 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:22:51.644 11:33:35 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:22:51.645 11:33:35 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:22:51.645 11:33:35 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:22:51.645 No valid GPT data, bailing 00:22:51.645 11:33:35 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:22:51.645 11:33:35 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:22:51.645 11:33:35 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:22:51.645 11:33:35 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:22:51.645 11:33:35 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:22:51.645 11:33:35 setup.sh.devices -- setup/common.sh@80 -- # echo 3840755982336 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@204 -- # (( 3840755982336 >= min_disk_size )) 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:22:51.645 11:33:35 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:22:51.645 11:33:35 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:22:51.645 11:33:35 setup.sh.devices -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:22:51.645 11:33:35 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:22:51.645 ************************************ 00:22:51.645 START TEST nvme_mount 00:22:51.645 ************************************ 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # nvme_mount 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:22:51.645 11:33:35 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:22:51.645 11:33:35 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:22:53.036 Creating new GPT entries in memory. 00:22:53.036 GPT data structures destroyed! You may now partition the disk using fdisk or 00:22:53.036 other utilities. 00:22:53.036 11:33:36 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:22:53.036 11:33:36 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:22:53.036 11:33:36 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:22:53.036 11:33:36 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:22:53.036 11:33:36 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:22:53.976 Creating new GPT entries in memory. 00:22:53.976 The operation has completed successfully. 
00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 79090 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:22:53.976 11:33:37 
setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:22:53.976 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:22:53.977 11:33:37 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 
11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:57.267 
11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:22:57.267 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:22:57.267 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:22:57.526 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:22:57.526 /dev/nvme0n1: 8 bytes were erased at offset 0x37e3ee55e00 (gpt): 45 46 49 20 50 41 52 54 00:22:57.527 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:22:57.527 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:22:57.527 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:22:57.527 11:33:41 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:22:57.527 11:33:41 setup.sh.devices.nvme_mount -- 
setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:57.527 11:33:41 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:22:57.527 11:33:41 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- 
setup/devices.sh@47 -- # setup output config 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:22:57.786 11:33:41 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount 
-- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 
11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local 
dev=0000:5e:00.0 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:01.086 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:23:01.087 11:33:44 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:23:01.087 11:33:44 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:23:01.087 11:33:44 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 
]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 
00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:23:05.276 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:23:05.276 00:23:05.276 real 0m13.079s 00:23:05.276 user 0m3.953s 00:23:05.276 sys 0m7.113s 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:05.276 11:33:48 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:23:05.276 ************************************ 00:23:05.276 END TEST nvme_mount 00:23:05.276 ************************************ 00:23:05.276 11:33:48 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:23:05.276 11:33:48 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:05.276 11:33:48 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:05.276 11:33:48 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:23:05.276 ************************************ 00:23:05.276 START TEST dm_mount 00:23:05.276 ************************************ 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # dm_mount 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- 
setup/devices.sh@148 -- # partition_drive nvme0n1 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:23:05.276 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:23:05.277 11:33:48 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:23:05.844 Creating new GPT entries in memory. 00:23:05.844 GPT data structures destroyed! You may now partition the disk using fdisk or 00:23:05.844 other utilities. 
00:23:05.844 11:33:49 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:23:05.844 11:33:49 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:23:05.844 11:33:49 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:23:05.844 11:33:49 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:23:05.844 11:33:49 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:23:06.780 Creating new GPT entries in memory. 00:23:06.780 The operation has completed successfully. 00:23:06.780 11:33:50 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:23:06.780 11:33:50 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:23:06.780 11:33:50 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:23:06.780 11:33:50 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:23:06.780 11:33:50 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:23:08.155 The operation has completed successfully. 
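common.sh@57-60 above derives each partition's sector range from a fixed 1 GiB size (1073741824 bytes over 512-byte sectors) before invoking `sgdisk --new`. The arithmetic stands on its own; the ranges it produces match the `--new=1:2048:2099199` and `--new=2:2099200:4196351` calls in the log:

```shell
# Sketch of the common.sh partition-boundary arithmetic (no disk access).
size=$(( 1073741824 / 512 ))   # 1 GiB in 512-byte sectors = 2097152
part_no=2
part_start=0 part_end=0
ranges=()
for (( part = 1; part <= part_no; part++ )); do
	# First partition starts at sector 2048; each later one starts
	# right after the previous one ends (common.sh@58-59).
	(( part_start = part_start == 0 ? 2048 : part_end + 1 ))
	(( part_end = part_start + size - 1 ))
	ranges+=("$part:$part_start:$part_end")
done
printf '%s\n' "${ranges[@]}"
```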
00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 83298 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local 
dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:08.155 
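devices.sh@165-166 above resolves the `/dev/mapper` symlink to its `dm-N` node, then strips the path down to the bare name used in the `/sys/class/block/*/holders/` checks. A sketch of that name extraction, using a sample resolved path in place of a live `readlink -f`:

```shell
# Sketch: reduce a resolved mapper target to its bare dm name, as in
# devices.sh@165-166. /dev/dm-0 is a sample value, not a live probe.
dm=/dev/dm-0   # what readlink -f /dev/mapper/nvme_dm_test returned in the log
dm=${dm##*/}   # strip the directory prefix -> dm-0
echo "$dm"
# the holders check then reads: [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]]
```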
11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:23:08.155 11:33:51 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.502 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 
holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:23:11.503 11:33:55 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 
]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 
]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.695 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:23:15.696 11:33:58 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:23:15.696 11:33:58 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:23:15.696 11:33:59 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:23:15.696 11:33:59 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:23:15.696 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:23:15.696 11:33:59 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:23:15.696 11:33:59 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:23:15.696 00:23:15.696 real 0m10.335s 00:23:15.696 user 0m2.593s 00:23:15.696 sys 0m4.862s 00:23:15.696 11:33:59 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:15.696 11:33:59 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:23:15.696 ************************************ 00:23:15.696 END TEST dm_mount 00:23:15.696 ************************************ 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:23:15.696 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:23:15.696 /dev/nvme0n1: 8 bytes were erased at offset 0x37e3ee55e00 (gpt): 45 46 49 20 50 41 52 54 00:23:15.696 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:23:15.696 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:23:15.696 11:33:59 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:23:15.696 00:23:15.696 real 0m27.596s 00:23:15.696 user 0m7.863s 00:23:15.696 sys 0m14.694s 00:23:15.696 11:33:59 setup.sh.devices -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:15.696 11:33:59 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:23:15.696 ************************************ 00:23:15.696 END TEST devices 00:23:15.696 ************************************ 00:23:15.696 00:23:15.696 real 1m32.497s 00:23:15.696 user 0m28.684s 00:23:15.696 sys 0m53.513s 00:23:15.696 11:33:59 setup.sh -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:15.696 11:33:59 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:23:15.696 ************************************ 00:23:15.696 END TEST setup.sh 
00:23:15.696 ************************************ 00:23:15.696 11:33:59 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:23:19.883 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:23:19.883 Hugepages 00:23:19.883 node hugesize free / total 00:23:19.883 node0 1048576kB 0 / 0 00:23:19.883 node0 2048kB 1024 / 1024 00:23:19.883 node1 1048576kB 0 / 0 00:23:19.883 node1 2048kB 1024 / 1024 00:23:19.883 00:23:19.883 Type BDF Vendor Device NUMA Driver Device Block devices 00:23:19.883 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:23:19.883 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:23:19.883 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:23:19.883 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:23:19.883 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:23:19.883 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:23:19.883 11:34:03 -- spdk/autotest.sh@130 -- # uname -s 00:23:19.883 11:34:03 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:23:19.883 11:34:03 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:23:19.883 11:34:03 -- common/autotest_common.sh@1530 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:23:23.166 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 
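The `setup.sh status` table above prints one device per row (Type, BDF, Vendor, Device, NUMA, Driver, Device, Block devices). Pulling out the NVMe rows is a one-line awk filter; a sketch over sample rows copied from the log:

```shell
# Sketch: extract NVMe BDFs from a setup.sh status table with awk.
# The sample rows mirror the log above; column 2 holds the BDF.
status='I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
VMD 0000:85:05.5 8086 201d 1 vfio-pci - -'
nvme_bdfs=$(awk '$1 == "NVMe" { print $2 }' <<<"$status")
echo "$nvme_bdfs"
```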
00:23:23.166 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:23:23.166 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:23:24.540 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:23:24.540 11:34:08 -- common/autotest_common.sh@1531 -- # sleep 1 00:23:25.477 11:34:09 -- common/autotest_common.sh@1532 -- # bdfs=() 00:23:25.477 11:34:09 -- common/autotest_common.sh@1532 -- # local bdfs 00:23:25.477 11:34:09 -- common/autotest_common.sh@1533 -- # bdfs=($(get_nvme_bdfs)) 00:23:25.477 11:34:09 -- common/autotest_common.sh@1533 -- # get_nvme_bdfs 00:23:25.477 11:34:09 -- common/autotest_common.sh@1512 -- # bdfs=() 00:23:25.477 11:34:09 -- common/autotest_common.sh@1512 -- # local bdfs 00:23:25.477 11:34:09 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:23:25.477 11:34:09 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:25.477 11:34:09 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr' 00:23:25.735 11:34:09 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:23:25.735 11:34:09 -- common/autotest_common.sh@1518 
-- # printf '%s\n' 0000:5e:00.0 00:23:25.735 11:34:09 -- common/autotest_common.sh@1535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:23:29.014 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:23:29.014 Waiting for block devices as requested 00:23:29.014 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:23:29.273 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:23:29.273 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:23:29.273 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:23:29.273 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:23:29.531 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:23:29.532 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:23:29.532 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:23:29.790 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:23:29.790 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:23:29.790 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:23:30.049 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:23:30.049 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:23:30.049 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:23:30.308 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:23:30.308 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:23:30.308 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:23:30.309 11:34:14 -- common/autotest_common.sh@1537 -- # for bdf in "${bdfs[@]}" 00:23:30.309 11:34:14 -- common/autotest_common.sh@1538 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:23:30.309 11:34:14 -- common/autotest_common.sh@1501 -- # grep 0000:5e:00.0/nvme/nvme 00:23:30.309 11:34:14 -- common/autotest_common.sh@1501 -- # readlink -f /sys/class/nvme/nvme0 00:23:30.309 11:34:14 -- common/autotest_common.sh@1501 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:23:30.309 11:34:14 -- common/autotest_common.sh@1502 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:23:30.309 11:34:14 -- common/autotest_common.sh@1506 
-- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:23:30.309 11:34:14 -- common/autotest_common.sh@1506 -- # printf '%s\n' nvme0 00:23:30.309 11:34:14 -- common/autotest_common.sh@1538 -- # nvme_ctrlr=/dev/nvme0 00:23:30.309 11:34:14 -- common/autotest_common.sh@1539 -- # [[ -z /dev/nvme0 ]] 00:23:30.567 11:34:14 -- common/autotest_common.sh@1544 -- # nvme id-ctrl /dev/nvme0 00:23:30.567 11:34:14 -- common/autotest_common.sh@1544 -- # grep oacs 00:23:30.567 11:34:14 -- common/autotest_common.sh@1544 -- # cut -d: -f2 00:23:30.567 11:34:14 -- common/autotest_common.sh@1544 -- # oacs=' 0x1e' 00:23:30.567 11:34:14 -- common/autotest_common.sh@1545 -- # oacs_ns_manage=8 00:23:30.567 11:34:14 -- common/autotest_common.sh@1547 -- # [[ 8 -ne 0 ]] 00:23:30.567 11:34:14 -- common/autotest_common.sh@1553 -- # nvme id-ctrl /dev/nvme0 00:23:30.567 11:34:14 -- common/autotest_common.sh@1553 -- # cut -d: -f2 00:23:30.567 11:34:14 -- common/autotest_common.sh@1553 -- # grep unvmcap 00:23:30.567 11:34:14 -- common/autotest_common.sh@1553 -- # unvmcap=' 0' 00:23:30.567 11:34:14 -- common/autotest_common.sh@1554 -- # [[ 0 -eq 0 ]] 00:23:30.567 11:34:14 -- common/autotest_common.sh@1556 -- # continue 00:23:30.567 11:34:14 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:23:30.567 11:34:14 -- common/autotest_common.sh@729 -- # xtrace_disable 00:23:30.567 11:34:14 -- common/autotest_common.sh@10 -- # set +x 00:23:30.567 11:34:14 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:23:30.567 11:34:14 -- common/autotest_common.sh@723 -- # xtrace_disable 00:23:30.567 11:34:14 -- common/autotest_common.sh@10 -- # set +x 00:23:30.567 11:34:14 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:23:33.856 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:23:33.856 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 
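autotest_common.sh@1544-1547 above pulls the `oacs` field out of `nvme id-ctrl` output and masks bit 3 (0x8, Namespace Management) to decide whether the namespace-revert path applies. A sketch of the parse and bit test, fed a sample id-ctrl line matching the log's value instead of querying a real controller:

```shell
# Sketch of the oacs parse/mask from autotest_common.sh@1544-1545.
# The id-ctrl line is a sample reproducing the log's value, not a real query.
line='oacs      : 0x1e'
oacs=$(cut -d: -f2 <<<"$line")     # -> ' 0x1e' (leading space is harmless)
oacs_ns_manage=$(( oacs & 0x8 ))   # bit 3 = Namespace Management support
echo "$oacs_ns_manage"
```

With `oacs=0x1e` this yields 8, matching `oacs_ns_manage=8` in the trace, so the `[[ 8 -ne 0 ]]` branch is taken.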
0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:23:33.856 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:23:35.234 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:23:35.234 11:34:19 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:23:35.234 11:34:19 -- common/autotest_common.sh@729 -- # xtrace_disable 00:23:35.234 11:34:19 -- common/autotest_common.sh@10 -- # set +x 00:23:35.235 11:34:19 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:23:35.235 11:34:19 -- common/autotest_common.sh@1590 -- # mapfile -t bdfs 00:23:35.235 11:34:19 -- common/autotest_common.sh@1590 -- # get_nvme_bdfs_by_id 0x0a54 00:23:35.235 11:34:19 -- common/autotest_common.sh@1576 -- # bdfs=() 00:23:35.235 11:34:19 -- common/autotest_common.sh@1576 -- # local bdfs 00:23:35.235 11:34:19 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs 00:23:35.235 11:34:19 -- common/autotest_common.sh@1512 -- # bdfs=() 00:23:35.235 11:34:19 -- common/autotest_common.sh@1512 -- # local bdfs 00:23:35.235 11:34:19 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:23:35.235 11:34:19 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:35.235 11:34:19 -- common/autotest_common.sh@1513 -- 
# jq -r '.config[].params.traddr' 00:23:35.235 11:34:19 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:23:35.235 11:34:19 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:5e:00.0 00:23:35.235 11:34:19 -- common/autotest_common.sh@1578 -- # for bdf in $(get_nvme_bdfs) 00:23:35.235 11:34:19 -- common/autotest_common.sh@1579 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:23:35.235 11:34:19 -- common/autotest_common.sh@1579 -- # device=0x0b60 00:23:35.235 11:34:19 -- common/autotest_common.sh@1580 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:23:35.235 11:34:19 -- common/autotest_common.sh@1585 -- # printf '%s\n' 00:23:35.235 11:34:19 -- common/autotest_common.sh@1591 -- # [[ -z '' ]] 00:23:35.235 11:34:19 -- common/autotest_common.sh@1592 -- # return 0 00:23:35.235 11:34:19 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:23:35.235 11:34:19 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:23:35.235 11:34:19 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:23:35.235 11:34:19 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:23:35.235 11:34:19 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:23:36.170 Restarting all devices. 00:23:39.454 lstat() error: No such file or directory 00:23:39.454 QAT Error: No GENERAL section found 00:23:39.454 Failed to configure qat_dev0 00:23:39.454 lstat() error: No such file or directory 00:23:39.454 QAT Error: No GENERAL section found 00:23:39.454 Failed to configure qat_dev1 00:23:39.454 lstat() error: No such file or directory 00:23:39.454 QAT Error: No GENERAL section found 00:23:39.454 Failed to configure qat_dev2 00:23:39.454 enable sriov 00:23:39.454 Checking status of all devices. 
00:23:39.454 There is 3 QAT acceleration device(s) in the system: 00:23:39.454 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:23:39.454 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:23:39.454 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:23:40.389 0000:3d:00.0 set to 16 VFs 00:23:40.953 0000:3f:00.0 set to 16 VFs 00:23:41.887 0000:da:00.0 set to 16 VFs 00:23:43.261 Properly configured the qat device with driver uio_pci_generic. 00:23:43.261 11:34:26 -- spdk/autotest.sh@162 -- # timing_enter lib 00:23:43.261 11:34:26 -- common/autotest_common.sh@723 -- # xtrace_disable 00:23:43.261 11:34:26 -- common/autotest_common.sh@10 -- # set +x 00:23:43.261 11:34:26 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:23:43.261 11:34:26 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:23:43.261 11:34:26 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:43.261 11:34:26 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:43.261 11:34:26 -- common/autotest_common.sh@10 -- # set +x 00:23:43.261 ************************************ 00:23:43.261 START TEST env 00:23:43.261 ************************************ 00:23:43.261 11:34:27 env -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:23:43.261 * Looking for test storage... 
00:23:43.261 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:23:43.261 11:34:27 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:23:43.261 11:34:27 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:43.261 11:34:27 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:43.261 11:34:27 env -- common/autotest_common.sh@10 -- # set +x 00:23:43.261 ************************************ 00:23:43.261 START TEST env_memory 00:23:43.261 ************************************ 00:23:43.261 11:34:27 env.env_memory -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:23:43.261 00:23:43.261 00:23:43.261 CUnit - A unit testing framework for C - Version 2.1-3 00:23:43.261 http://cunit.sourceforge.net/ 00:23:43.261 00:23:43.261 00:23:43.261 Suite: memory 00:23:43.262 Test: alloc and free memory map ...[2024-06-10 11:34:27.201058] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:23:43.521 passed 00:23:43.521 Test: mem map translation ...[2024-06-10 11:34:27.219605] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:23:43.521 [2024-06-10 11:34:27.219623] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:23:43.521 [2024-06-10 11:34:27.219658] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:23:43.521 [2024-06-10 11:34:27.219667] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 
600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:23:43.521 passed 00:23:43.521 Test: mem map registration ...[2024-06-10 11:34:27.255554] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:23:43.521 [2024-06-10 11:34:27.255572] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:23:43.521 passed 00:23:43.521 Test: mem map adjacent registrations ...passed 00:23:43.521 00:23:43.521 Run Summary: Type Total Ran Passed Failed Inactive 00:23:43.521 suites 1 1 n/a 0 0 00:23:43.521 tests 4 4 4 0 0 00:23:43.521 asserts 152 152 152 0 n/a 00:23:43.521 00:23:43.521 Elapsed time = 0.134 seconds 00:23:43.521 00:23:43.521 real 0m0.149s 00:23:43.521 user 0m0.133s 00:23:43.521 sys 0m0.015s 00:23:43.521 11:34:27 env.env_memory -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:43.521 11:34:27 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:23:43.521 ************************************ 00:23:43.521 END TEST env_memory 00:23:43.521 ************************************ 00:23:43.521 11:34:27 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:23:43.521 11:34:27 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:43.521 11:34:27 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:43.521 11:34:27 env -- common/autotest_common.sh@10 -- # set +x 00:23:43.521 ************************************ 00:23:43.521 START TEST env_vtophys 00:23:43.521 ************************************ 00:23:43.521 11:34:27 env.env_vtophys -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:23:43.521 EAL: lib.eal log level changed from notice to debug 00:23:43.521 EAL: 
Detected lcore 0 as core 0 on socket 0 00:23:43.521 EAL: Detected lcore 1 as core 1 on socket 0 00:23:43.521 EAL: Detected lcore 2 as core 2 on socket 0 00:23:43.521 EAL: Detected lcore 3 as core 3 on socket 0 00:23:43.521 EAL: Detected lcore 4 as core 4 on socket 0 00:23:43.521 EAL: Detected lcore 5 as core 8 on socket 0 00:23:43.521 EAL: Detected lcore 6 as core 9 on socket 0 00:23:43.521 EAL: Detected lcore 7 as core 10 on socket 0 00:23:43.521 EAL: Detected lcore 8 as core 11 on socket 0 00:23:43.521 EAL: Detected lcore 9 as core 16 on socket 0 00:23:43.521 EAL: Detected lcore 10 as core 17 on socket 0 00:23:43.521 EAL: Detected lcore 11 as core 18 on socket 0 00:23:43.521 EAL: Detected lcore 12 as core 19 on socket 0 00:23:43.521 EAL: Detected lcore 13 as core 20 on socket 0 00:23:43.521 EAL: Detected lcore 14 as core 24 on socket 0 00:23:43.521 EAL: Detected lcore 15 as core 25 on socket 0 00:23:43.521 EAL: Detected lcore 16 as core 26 on socket 0 00:23:43.521 EAL: Detected lcore 17 as core 27 on socket 0 00:23:43.521 EAL: Detected lcore 18 as core 0 on socket 1 00:23:43.521 EAL: Detected lcore 19 as core 1 on socket 1 00:23:43.521 EAL: Detected lcore 20 as core 2 on socket 1 00:23:43.521 EAL: Detected lcore 21 as core 3 on socket 1 00:23:43.521 EAL: Detected lcore 22 as core 4 on socket 1 00:23:43.521 EAL: Detected lcore 23 as core 8 on socket 1 00:23:43.521 EAL: Detected lcore 24 as core 9 on socket 1 00:23:43.521 EAL: Detected lcore 25 as core 10 on socket 1 00:23:43.521 EAL: Detected lcore 26 as core 11 on socket 1 00:23:43.521 EAL: Detected lcore 27 as core 16 on socket 1 00:23:43.521 EAL: Detected lcore 28 as core 17 on socket 1 00:23:43.521 EAL: Detected lcore 29 as core 18 on socket 1 00:23:43.521 EAL: Detected lcore 30 as core 19 on socket 1 00:23:43.521 EAL: Detected lcore 31 as core 20 on socket 1 00:23:43.521 EAL: Detected lcore 32 as core 24 on socket 1 00:23:43.521 EAL: Detected lcore 33 as core 25 on socket 1 00:23:43.521 EAL: Detected lcore 34 
as core 26 on socket 1 00:23:43.521 EAL: Detected lcore 35 as core 27 on socket 1 00:23:43.521 EAL: Detected lcore 36 as core 0 on socket 0 00:23:43.521 EAL: Detected lcore 37 as core 1 on socket 0 00:23:43.521 EAL: Detected lcore 38 as core 2 on socket 0 00:23:43.521 EAL: Detected lcore 39 as core 3 on socket 0 00:23:43.521 EAL: Detected lcore 40 as core 4 on socket 0 00:23:43.521 EAL: Detected lcore 41 as core 8 on socket 0 00:23:43.521 EAL: Detected lcore 42 as core 9 on socket 0 00:23:43.521 EAL: Detected lcore 43 as core 10 on socket 0 00:23:43.521 EAL: Detected lcore 44 as core 11 on socket 0 00:23:43.521 EAL: Detected lcore 45 as core 16 on socket 0 00:23:43.521 EAL: Detected lcore 46 as core 17 on socket 0 00:23:43.521 EAL: Detected lcore 47 as core 18 on socket 0 00:23:43.521 EAL: Detected lcore 48 as core 19 on socket 0 00:23:43.521 EAL: Detected lcore 49 as core 20 on socket 0 00:23:43.521 EAL: Detected lcore 50 as core 24 on socket 0 00:23:43.521 EAL: Detected lcore 51 as core 25 on socket 0 00:23:43.521 EAL: Detected lcore 52 as core 26 on socket 0 00:23:43.522 EAL: Detected lcore 53 as core 27 on socket 0 00:23:43.522 EAL: Detected lcore 54 as core 0 on socket 1 00:23:43.522 EAL: Detected lcore 55 as core 1 on socket 1 00:23:43.522 EAL: Detected lcore 56 as core 2 on socket 1 00:23:43.522 EAL: Detected lcore 57 as core 3 on socket 1 00:23:43.522 EAL: Detected lcore 58 as core 4 on socket 1 00:23:43.522 EAL: Detected lcore 59 as core 8 on socket 1 00:23:43.522 EAL: Detected lcore 60 as core 9 on socket 1 00:23:43.522 EAL: Detected lcore 61 as core 10 on socket 1 00:23:43.522 EAL: Detected lcore 62 as core 11 on socket 1 00:23:43.522 EAL: Detected lcore 63 as core 16 on socket 1 00:23:43.522 EAL: Detected lcore 64 as core 17 on socket 1 00:23:43.522 EAL: Detected lcore 65 as core 18 on socket 1 00:23:43.522 EAL: Detected lcore 66 as core 19 on socket 1 00:23:43.522 EAL: Detected lcore 67 as core 20 on socket 1 00:23:43.522 EAL: Detected lcore 68 as core 
24 on socket 1 00:23:43.522 EAL: Detected lcore 69 as core 25 on socket 1 00:23:43.522 EAL: Detected lcore 70 as core 26 on socket 1 00:23:43.522 EAL: Detected lcore 71 as core 27 on socket 1 00:23:43.522 EAL: Maximum logical cores by configuration: 128 00:23:43.522 EAL: Detected CPU lcores: 72 00:23:43.522 EAL: Detected NUMA nodes: 2 00:23:43.522 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:23:43.522 EAL: Detected shared linkage of DPDK 00:23:43.522 EAL: No shared files mode enabled, IPC will be disabled 00:23:43.522 EAL: No shared files mode enabled, IPC is disabled 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 
0000:3f:01.2 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA 
as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:23:43.522 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:23:43.522 EAL: Bus pci wants IOVA as 'PA' 00:23:43.522 EAL: Bus auxiliary wants IOVA as 'DC' 00:23:43.522 EAL: Bus vdev wants IOVA as 'DC' 00:23:43.522 EAL: Selected IOVA mode 'PA' 00:23:43.522 EAL: Probing VFIO support... 00:23:43.522 EAL: IOMMU type 1 (Type 1) is supported 00:23:43.522 EAL: IOMMU type 7 (sPAPR) is not supported 00:23:43.522 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:23:43.522 EAL: VFIO support initialized 00:23:43.522 EAL: Ask a virtual area of 0x2e000 bytes 00:23:43.522 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:23:43.522 EAL: Setting up physically contiguous memory... 00:23:43.522 EAL: Setting maximum number of open files to 524288 00:23:43.522 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:23:43.522 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:23:43.522 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:23:43.522 EAL: 
Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:23:43.522 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area 
found at 0x201800e00000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:23:43.522 EAL: Ask a virtual area of 0x61000 bytes 00:23:43.522 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:23:43.522 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:23:43.522 EAL: Ask a virtual area of 0x400000000 bytes 00:23:43.522 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:23:43.522 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:23:43.522 EAL: Hugepages will be freed exactly as allocated. 00:23:43.522 EAL: No shared files mode enabled, IPC is disabled 00:23:43.522 EAL: No shared files mode enabled, IPC is disabled 00:23:43.522 EAL: TSC frequency is ~2300000 KHz 00:23:43.522 EAL: Main lcore 0 is ready (tid=7f0e4a025b00;cpuset=[0]) 00:23:43.522 EAL: Trying to obtain current memory policy. 00:23:43.522 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.522 EAL: Restoring previous memory policy: 0 00:23:43.522 EAL: request: mp_malloc_sync 00:23:43.522 EAL: No shared files mode enabled, IPC is disabled 00:23:43.522 EAL: Heap on socket 0 was expanded by 2MB 00:23:43.522 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:23:43.522 EAL: probe driver: 8086:37c9 qat 00:23:43.522 EAL: PCI memory mapped at 0x202001000000 00:23:43.522 EAL: PCI memory mapped at 0x202001001000 00:23:43.522 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:23:43.522 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:23:43.522 EAL: probe driver: 8086:37c9 qat 00:23:43.522 EAL: PCI memory mapped at 0x202001002000 00:23:43.522 EAL: PCI memory mapped at 0x202001003000 00:23:43.522 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:23:43.522 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:23:43.522 EAL: probe driver: 8086:37c9 qat 00:23:43.522 EAL: PCI memory mapped at 0x202001004000 00:23:43.522 EAL: PCI memory mapped at 0x202001005000 00:23:43.522 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:23:43.522 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:23:43.522 EAL: probe driver: 8086:37c9 qat 00:23:43.522 EAL: PCI memory mapped at 0x202001006000 00:23:43.522 EAL: PCI memory mapped at 0x202001007000 00:23:43.522 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:23:43.522 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:23:43.522 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001008000 00:23:43.523 EAL: PCI memory mapped at 0x202001009000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200100a000 00:23:43.523 EAL: PCI memory mapped at 0x20200100b000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200100c000 00:23:43.523 EAL: PCI memory mapped at 0x20200100d000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200100e000 00:23:43.523 EAL: PCI memory mapped at 0x20200100f000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001010000 00:23:43.523 EAL: PCI memory mapped at 0x202001011000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 
0x202001012000 00:23:43.523 EAL: PCI memory mapped at 0x202001013000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001014000 00:23:43.523 EAL: PCI memory mapped at 0x202001015000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001016000 00:23:43.523 EAL: PCI memory mapped at 0x202001017000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001018000 00:23:43.523 EAL: PCI memory mapped at 0x202001019000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200101a000 00:23:43.523 EAL: PCI memory mapped at 0x20200101b000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200101c000 00:23:43.523 EAL: PCI memory mapped at 0x20200101d000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:23:43.523 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200101e000 00:23:43.523 EAL: PCI memory mapped at 0x20200101f000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 
00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001020000 00:23:43.523 EAL: PCI memory mapped at 0x202001021000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001022000 00:23:43.523 EAL: PCI memory mapped at 0x202001023000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001024000 00:23:43.523 EAL: PCI memory mapped at 0x202001025000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001026000 00:23:43.523 EAL: PCI memory mapped at 0x202001027000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001028000 00:23:43.523 EAL: PCI memory mapped at 0x202001029000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200102a000 00:23:43.523 EAL: PCI memory mapped at 0x20200102b000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200102c000 00:23:43.523 EAL: PCI memory mapped at 0x20200102d000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:01.6 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200102e000 00:23:43.523 EAL: PCI memory mapped at 0x20200102f000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001030000 00:23:43.523 EAL: PCI memory mapped at 0x202001031000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001032000 00:23:43.523 EAL: PCI memory mapped at 0x202001033000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001034000 00:23:43.523 EAL: PCI memory mapped at 0x202001035000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001036000 00:23:43.523 EAL: PCI memory mapped at 0x202001037000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001038000 00:23:43.523 EAL: PCI memory mapped at 0x202001039000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200103a000 00:23:43.523 EAL: PCI memory 
mapped at 0x20200103b000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200103c000 00:23:43.523 EAL: PCI memory mapped at 0x20200103d000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:23:43.523 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200103e000 00:23:43.523 EAL: PCI memory mapped at 0x20200103f000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:23:43.523 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001040000 00:23:43.523 EAL: PCI memory mapped at 0x202001041000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:23:43.523 EAL: Trying to obtain current memory policy. 
00:23:43.523 EAL: Setting policy MPOL_PREFERRED for socket 1 00:23:43.523 EAL: Restoring previous memory policy: 4 00:23:43.523 EAL: request: mp_malloc_sync 00:23:43.523 EAL: No shared files mode enabled, IPC is disabled 00:23:43.523 EAL: Heap on socket 1 was expanded by 2MB 00:23:43.523 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001042000 00:23:43.523 EAL: PCI memory mapped at 0x202001043000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:23:43.523 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001044000 00:23:43.523 EAL: PCI memory mapped at 0x202001045000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:23:43.523 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001046000 00:23:43.523 EAL: PCI memory mapped at 0x202001047000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:23:43.523 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x202001048000 00:23:43.523 EAL: PCI memory mapped at 0x202001049000 00:23:43.523 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:23:43.523 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:23:43.523 EAL: probe driver: 8086:37c9 qat 00:23:43.523 EAL: PCI memory mapped at 0x20200104a000 00:23:43.783 EAL: PCI memory mapped at 0x20200104b000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x20200104c000 00:23:43.783 EAL: PCI memory mapped at 0x20200104d000 00:23:43.783 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x20200104e000 00:23:43.783 EAL: PCI memory mapped at 0x20200104f000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x202001050000 00:23:43.783 EAL: PCI memory mapped at 0x202001051000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x202001052000 00:23:43.783 EAL: PCI memory mapped at 0x202001053000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x202001054000 00:23:43.783 EAL: PCI memory mapped at 0x202001055000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x202001056000 00:23:43.783 EAL: PCI memory mapped at 0x202001057000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x202001058000 00:23:43.783 EAL: PCI memory mapped at 0x202001059000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 
0x20200105a000 00:23:43.783 EAL: PCI memory mapped at 0x20200105b000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x20200105c000 00:23:43.783 EAL: PCI memory mapped at 0x20200105d000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:23:43.783 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:23:43.783 EAL: probe driver: 8086:37c9 qat 00:23:43.783 EAL: PCI memory mapped at 0x20200105e000 00:23:43.783 EAL: PCI memory mapped at 0x20200105f000 00:23:43.783 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: No PCI address specified using 'addr=' in: bus=pci 00:23:43.783 EAL: Mem event callback 'spdk:(nil)' registered 00:23:43.783 00:23:43.783 00:23:43.783 CUnit - A unit testing framework for C - Version 2.1-3 00:23:43.783 http://cunit.sourceforge.net/ 00:23:43.783 00:23:43.783 00:23:43.783 Suite: components_suite 00:23:43.783 Test: vtophys_malloc_test ...passed 00:23:43.783 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 4MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was shrunk by 4MB 00:23:43.783 EAL: Trying to obtain current memory policy. 
00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 6MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was shrunk by 6MB 00:23:43.783 EAL: Trying to obtain current memory policy. 00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 10MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was shrunk by 10MB 00:23:43.783 EAL: Trying to obtain current memory policy. 00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 18MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was shrunk by 18MB 00:23:43.783 EAL: Trying to obtain current memory policy. 
00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 34MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was shrunk by 34MB 00:23:43.783 EAL: Trying to obtain current memory policy. 00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 66MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was shrunk by 66MB 00:23:43.783 EAL: Trying to obtain current memory policy. 00:23:43.783 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.783 EAL: Restoring previous memory policy: 4 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.783 EAL: Heap on socket 0 was expanded by 130MB 00:23:43.783 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.783 EAL: request: mp_malloc_sync 00:23:43.783 EAL: No shared files mode enabled, IPC is disabled 00:23:43.784 EAL: Heap on socket 0 was shrunk by 130MB 00:23:43.784 EAL: Trying to obtain current memory policy. 
00:23:43.784 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:43.784 EAL: Restoring previous memory policy: 4 00:23:43.784 EAL: Calling mem event callback 'spdk:(nil)' 00:23:43.784 EAL: request: mp_malloc_sync 00:23:43.784 EAL: No shared files mode enabled, IPC is disabled 00:23:43.784 EAL: Heap on socket 0 was expanded by 258MB 00:23:43.784 EAL: Calling mem event callback 'spdk:(nil)' 00:23:44.042 EAL: request: mp_malloc_sync 00:23:44.042 EAL: No shared files mode enabled, IPC is disabled 00:23:44.042 EAL: Heap on socket 0 was shrunk by 258MB 00:23:44.042 EAL: Trying to obtain current memory policy. 00:23:44.042 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:44.042 EAL: Restoring previous memory policy: 4 00:23:44.042 EAL: Calling mem event callback 'spdk:(nil)' 00:23:44.042 EAL: request: mp_malloc_sync 00:23:44.042 EAL: No shared files mode enabled, IPC is disabled 00:23:44.042 EAL: Heap on socket 0 was expanded by 514MB 00:23:44.042 EAL: Calling mem event callback 'spdk:(nil)' 00:23:44.301 EAL: request: mp_malloc_sync 00:23:44.301 EAL: No shared files mode enabled, IPC is disabled 00:23:44.301 EAL: Heap on socket 0 was shrunk by 514MB 00:23:44.301 EAL: Trying to obtain current memory policy. 
00:23:44.301 EAL: Setting policy MPOL_PREFERRED for socket 0 00:23:44.559 EAL: Restoring previous memory policy: 4 00:23:44.559 EAL: Calling mem event callback 'spdk:(nil)' 00:23:44.559 EAL: request: mp_malloc_sync 00:23:44.559 EAL: No shared files mode enabled, IPC is disabled 00:23:44.559 EAL: Heap on socket 0 was expanded by 1026MB 00:23:44.559 EAL: Calling mem event callback 'spdk:(nil)' 00:23:44.818 EAL: request: mp_malloc_sync 00:23:44.818 EAL: No shared files mode enabled, IPC is disabled 00:23:44.818 EAL: Heap on socket 0 was shrunk by 1026MB 00:23:44.818 passed 00:23:44.818 00:23:44.818 Run Summary: Type Total Ran Passed Failed Inactive 00:23:44.818 suites 1 1 n/a 0 0 00:23:44.818 tests 2 2 2 0 0 00:23:44.818 asserts 6240 6240 6240 0 n/a 00:23:44.818 00:23:44.818 Elapsed time = 1.129 seconds 00:23:44.818 EAL: No shared files mode enabled, IPC is disabled 00:23:44.818 EAL: No shared files mode enabled, IPC is disabled 00:23:44.818 EAL: No shared files mode enabled, IPC is disabled 00:23:44.818 00:23:44.818 real 0m1.279s 00:23:44.818 user 0m0.745s 00:23:44.818 sys 0m0.510s 00:23:44.818 11:34:28 env.env_vtophys -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:44.818 11:34:28 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:23:44.818 ************************************ 00:23:44.818 END TEST env_vtophys 00:23:44.818 ************************************ 00:23:44.818 11:34:28 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:23:44.818 11:34:28 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:44.818 11:34:28 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:44.818 11:34:28 env -- common/autotest_common.sh@10 -- # set +x 00:23:44.818 ************************************ 00:23:44.818 START TEST env_pci 00:23:44.818 ************************************ 00:23:44.818 11:34:28 env.env_pci -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:23:44.818 00:23:44.818 00:23:44.818 CUnit - A unit testing framework for C - Version 2.1-3 00:23:44.818 http://cunit.sourceforge.net/ 00:23:44.818 00:23:44.818 00:23:44.818 Suite: pci 00:23:45.078 Test: pci_hook ...[2024-06-10 11:34:28.764025] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 93917 has claimed it 00:23:45.078 EAL: Cannot find device (10000:00:01.0) 00:23:45.078 EAL: Failed to attach device on primary process 00:23:45.078 passed 00:23:45.078 00:23:45.078 Run Summary: Type Total Ran Passed Failed Inactive 00:23:45.078 suites 1 1 n/a 0 0 00:23:45.078 tests 1 1 1 0 0 00:23:45.078 asserts 25 25 25 0 n/a 00:23:45.078 00:23:45.078 Elapsed time = 0.035 seconds 00:23:45.078 00:23:45.078 real 0m0.059s 00:23:45.078 user 0m0.023s 00:23:45.078 sys 0m0.036s 00:23:45.078 11:34:28 env.env_pci -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:45.078 11:34:28 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:23:45.078 ************************************ 00:23:45.078 END TEST env_pci 00:23:45.078 ************************************ 00:23:45.078 11:34:28 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:23:45.078 11:34:28 env -- env/env.sh@15 -- # uname 00:23:45.078 11:34:28 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:23:45.078 11:34:28 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:23:45.078 11:34:28 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:23:45.078 11:34:28 env -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:45.078 11:34:28 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:45.078 11:34:28 env -- common/autotest_common.sh@10 -- # set +x 
00:23:45.078 ************************************ 00:23:45.078 START TEST env_dpdk_post_init 00:23:45.078 ************************************ 00:23:45.078 11:34:28 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:23:45.078 EAL: Detected CPU lcores: 72 00:23:45.078 EAL: Detected NUMA nodes: 2 00:23:45.078 EAL: Detected shared linkage of DPDK 00:23:45.078 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:23:45.078 EAL: Selected IOVA mode 'PA' 00:23:45.078 EAL: VFIO support initialized 00:23:45.078 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:23:45.078 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:23:45.078 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.078 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3d:01.3 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: 
Creating cryptodev 0000:3d:01.7_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 
00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 
00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 
0000:3f:01.5_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:23:45.079 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.079 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:23:45.079 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.079 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:23:45.080 CRYPTODEV: 
Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, 
max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 
0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:23:45.080 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:23:45.080 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:23:45.080 TELEMETRY: No legacy callbacks, legacy socket not created 00:23:45.080 EAL: Using IOMMU type 1 (Type 1) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 
00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:23:45.340 EAL: Ignore mapping IO port bar(1) 00:23:45.340 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:23:45.598 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:23:45.598 EAL: Ignore mapping IO port bar(1) 00:23:45.598 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 
0000:80:04.6 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:23:45.599 EAL: Ignore mapping IO port bar(1) 00:23:45.599 EAL: Ignore mapping IO port bar(5) 00:23:45.599 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:23:47.050 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:23:47.050 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:23:47.050 Starting DPDK initialization... 00:23:47.050 Starting SPDK post initialization... 00:23:47.050 SPDK NVMe probe 00:23:47.050 Attaching to 0000:5e:00.0 00:23:47.050 Attached to 0000:5e:00.0 00:23:47.050 Cleaning up... 00:23:47.050 00:23:47.050 real 0m2.042s 00:23:47.050 user 0m1.262s 00:23:47.050 sys 0m0.355s 00:23:47.050 11:34:30 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:47.050 11:34:30 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:23:47.050 ************************************ 00:23:47.050 END TEST env_dpdk_post_init 00:23:47.050 ************************************ 00:23:47.050 11:34:30 env -- env/env.sh@26 -- # uname 00:23:47.050 11:34:30 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:23:47.051 11:34:30 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:23:47.051 11:34:30 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:47.051 11:34:30 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:47.051 11:34:30 env -- common/autotest_common.sh@10 -- # set +x 00:23:47.310 ************************************ 00:23:47.310 START TEST env_mem_callbacks 00:23:47.310 ************************************ 00:23:47.310 11:34:30 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:23:47.310 EAL: Detected CPU 
lcores: 72 00:23:47.310 EAL: Detected NUMA nodes: 2 00:23:47.310 EAL: Detected shared linkage of DPDK 00:23:47.310 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:23:47.310 EAL: Selected IOVA mode 'PA' 00:23:47.310 EAL: VFIO support initialized 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue 
pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket 
id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3d:02.5 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: 
Creating cryptodev 0000:3f:01.1_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.310 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:23:47.310 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.310 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 
00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 
00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:23:47.311 CRYPTODEV: Creating cryptodev 
0000:3f:02.7_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:23:47.311 CRYPTODEV: Initialisation 
parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:23:47.311 CRYPTODEV: 
Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, 
max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:23:47.311 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:23:47.311 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:23:47.311 TELEMETRY: No legacy callbacks, legacy socket not created 00:23:47.311 00:23:47.312 00:23:47.312 CUnit - A unit testing framework for C - Version 2.1-3 00:23:47.312 http://cunit.sourceforge.net/ 00:23:47.312 00:23:47.312 00:23:47.312 Suite: memory 00:23:47.312 Test: test ... 
00:23:47.312 register 0x200000200000 2097152 00:23:47.312 register 0x201000a00000 2097152 00:23:47.312 malloc 3145728 00:23:47.312 register 0x200000400000 4194304 00:23:47.312 buf 0x200000500000 len 3145728 PASSED 00:23:47.312 malloc 64 00:23:47.312 buf 0x2000004fff40 len 64 PASSED 00:23:47.312 malloc 4194304 00:23:47.312 register 0x200000800000 6291456 00:23:47.312 buf 0x200000a00000 len 4194304 PASSED 00:23:47.312 free 0x200000500000 3145728 00:23:47.312 free 0x2000004fff40 64 00:23:47.312 unregister 0x200000400000 4194304 PASSED 00:23:47.312 free 0x200000a00000 4194304 00:23:47.312 unregister 0x200000800000 6291456 PASSED 00:23:47.312 malloc 8388608 00:23:47.312 register 0x200000400000 10485760 00:23:47.312 buf 0x200000600000 len 8388608 PASSED 00:23:47.312 free 0x200000600000 8388608 00:23:47.312 unregister 0x200000400000 10485760 PASSED 00:23:47.312 passed 00:23:47.312 00:23:47.312 Run Summary: Type Total Ran Passed Failed Inactive 00:23:47.312 suites 1 1 n/a 0 0 00:23:47.312 tests 1 1 1 0 0 00:23:47.312 asserts 16 16 16 0 n/a 00:23:47.312 00:23:47.312 Elapsed time = 0.007 seconds 00:23:47.312 00:23:47.312 real 0m0.087s 00:23:47.312 user 0m0.023s 00:23:47.312 sys 0m0.063s 00:23:47.312 11:34:31 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:47.312 11:34:31 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:23:47.312 ************************************ 00:23:47.312 END TEST env_mem_callbacks 00:23:47.312 ************************************ 00:23:47.312 00:23:47.312 real 0m4.122s 00:23:47.312 user 0m2.366s 00:23:47.312 sys 0m1.348s 00:23:47.312 11:34:31 env -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:47.312 11:34:31 env -- common/autotest_common.sh@10 -- # set +x 00:23:47.312 ************************************ 00:23:47.312 END TEST env 00:23:47.312 ************************************ 00:23:47.312 11:34:31 -- spdk/autotest.sh@169 -- # run_test rpc 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:23:47.312 11:34:31 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:47.312 11:34:31 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:47.312 11:34:31 -- common/autotest_common.sh@10 -- # set +x 00:23:47.312 ************************************ 00:23:47.312 START TEST rpc 00:23:47.312 ************************************ 00:23:47.312 11:34:31 rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:23:47.570 * Looking for test storage... 00:23:47.570 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:23:47.570 11:34:31 rpc -- rpc/rpc.sh@65 -- # spdk_pid=94389 00:23:47.570 11:34:31 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:23:47.570 11:34:31 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:23:47.570 11:34:31 rpc -- rpc/rpc.sh@67 -- # waitforlisten 94389 00:23:47.570 11:34:31 rpc -- common/autotest_common.sh@830 -- # '[' -z 94389 ']' 00:23:47.570 11:34:31 rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:47.570 11:34:31 rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:47.570 11:34:31 rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:47.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:47.570 11:34:31 rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:47.570 11:34:31 rpc -- common/autotest_common.sh@10 -- # set +x 00:23:47.570 [2024-06-10 11:34:31.380285] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:23:47.570 [2024-06-10 11:34:31.380344] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid94389 ] 00:23:47.570 [2024-06-10 11:34:31.467961] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:47.827 [2024-06-10 11:34:31.553107] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:23:47.827 [2024-06-10 11:34:31.553148] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 94389' to capture a snapshot of events at runtime. 00:23:47.827 [2024-06-10 11:34:31.553157] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:23:47.827 [2024-06-10 11:34:31.553168] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:23:47.827 [2024-06-10 11:34:31.553174] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid94389 for offline analysis/debug. 
00:23:47.827 [2024-06-10 11:34:31.553197] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:48.392 11:34:32 rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:48.392 11:34:32 rpc -- common/autotest_common.sh@863 -- # return 0 00:23:48.392 11:34:32 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:23:48.392 11:34:32 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:23:48.392 11:34:32 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:23:48.392 11:34:32 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:23:48.392 11:34:32 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:48.392 11:34:32 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:48.392 11:34:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:23:48.392 ************************************ 00:23:48.392 START TEST rpc_integrity 00:23:48.392 ************************************ 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.392 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:23:48.392 { 00:23:48.392 "name": "Malloc0", 00:23:48.392 "aliases": [ 00:23:48.392 "16a60d09-9227-4db7-8c05-4a3fbd37797e" 00:23:48.392 ], 00:23:48.392 "product_name": "Malloc disk", 00:23:48.392 "block_size": 512, 00:23:48.392 "num_blocks": 16384, 00:23:48.392 "uuid": "16a60d09-9227-4db7-8c05-4a3fbd37797e", 00:23:48.392 "assigned_rate_limits": { 00:23:48.392 "rw_ios_per_sec": 0, 00:23:48.392 "rw_mbytes_per_sec": 0, 00:23:48.392 "r_mbytes_per_sec": 0, 00:23:48.392 "w_mbytes_per_sec": 0 00:23:48.392 }, 00:23:48.392 "claimed": false, 00:23:48.392 "zoned": false, 00:23:48.392 "supported_io_types": { 00:23:48.392 "read": true, 00:23:48.392 "write": true, 00:23:48.392 "unmap": true, 00:23:48.392 "write_zeroes": true, 00:23:48.392 "flush": true, 00:23:48.392 "reset": true, 00:23:48.392 "compare": false, 00:23:48.392 "compare_and_write": false, 00:23:48.392 "abort": true, 00:23:48.392 "nvme_admin": false, 00:23:48.392 "nvme_io": false 00:23:48.392 }, 00:23:48.392 
"memory_domains": [ 00:23:48.392 { 00:23:48.392 "dma_device_id": "system", 00:23:48.392 "dma_device_type": 1 00:23:48.392 }, 00:23:48.392 { 00:23:48.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.392 "dma_device_type": 2 00:23:48.392 } 00:23:48.392 ], 00:23:48.392 "driver_specific": {} 00:23:48.392 } 00:23:48.392 ]' 00:23:48.392 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.650 [2024-06-10 11:34:32.341449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:23:48.650 [2024-06-10 11:34:32.341481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:48.650 [2024-06-10 11:34:32.341494] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a2f520 00:23:48.650 [2024-06-10 11:34:32.341503] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:48.650 [2024-06-10 11:34:32.342653] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:48.650 [2024-06-10 11:34:32.342675] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:23:48.650 Passthru0 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.650 11:34:32 rpc.rpc_integrity 
-- rpc/rpc.sh@20 -- # bdevs='[ 00:23:48.650 { 00:23:48.650 "name": "Malloc0", 00:23:48.650 "aliases": [ 00:23:48.650 "16a60d09-9227-4db7-8c05-4a3fbd37797e" 00:23:48.650 ], 00:23:48.650 "product_name": "Malloc disk", 00:23:48.650 "block_size": 512, 00:23:48.650 "num_blocks": 16384, 00:23:48.650 "uuid": "16a60d09-9227-4db7-8c05-4a3fbd37797e", 00:23:48.650 "assigned_rate_limits": { 00:23:48.650 "rw_ios_per_sec": 0, 00:23:48.650 "rw_mbytes_per_sec": 0, 00:23:48.650 "r_mbytes_per_sec": 0, 00:23:48.650 "w_mbytes_per_sec": 0 00:23:48.650 }, 00:23:48.650 "claimed": true, 00:23:48.650 "claim_type": "exclusive_write", 00:23:48.650 "zoned": false, 00:23:48.650 "supported_io_types": { 00:23:48.650 "read": true, 00:23:48.650 "write": true, 00:23:48.650 "unmap": true, 00:23:48.650 "write_zeroes": true, 00:23:48.650 "flush": true, 00:23:48.650 "reset": true, 00:23:48.650 "compare": false, 00:23:48.650 "compare_and_write": false, 00:23:48.650 "abort": true, 00:23:48.650 "nvme_admin": false, 00:23:48.650 "nvme_io": false 00:23:48.650 }, 00:23:48.650 "memory_domains": [ 00:23:48.650 { 00:23:48.650 "dma_device_id": "system", 00:23:48.650 "dma_device_type": 1 00:23:48.650 }, 00:23:48.650 { 00:23:48.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.650 "dma_device_type": 2 00:23:48.650 } 00:23:48.650 ], 00:23:48.650 "driver_specific": {} 00:23:48.650 }, 00:23:48.650 { 00:23:48.650 "name": "Passthru0", 00:23:48.650 "aliases": [ 00:23:48.650 "35ed8d8b-3d08-54e1-9add-930cda8433cd" 00:23:48.650 ], 00:23:48.650 "product_name": "passthru", 00:23:48.650 "block_size": 512, 00:23:48.650 "num_blocks": 16384, 00:23:48.650 "uuid": "35ed8d8b-3d08-54e1-9add-930cda8433cd", 00:23:48.650 "assigned_rate_limits": { 00:23:48.650 "rw_ios_per_sec": 0, 00:23:48.650 "rw_mbytes_per_sec": 0, 00:23:48.650 "r_mbytes_per_sec": 0, 00:23:48.650 "w_mbytes_per_sec": 0 00:23:48.650 }, 00:23:48.650 "claimed": false, 00:23:48.650 "zoned": false, 00:23:48.650 "supported_io_types": { 00:23:48.650 "read": true, 
00:23:48.650 "write": true, 00:23:48.650 "unmap": true, 00:23:48.650 "write_zeroes": true, 00:23:48.650 "flush": true, 00:23:48.650 "reset": true, 00:23:48.650 "compare": false, 00:23:48.650 "compare_and_write": false, 00:23:48.650 "abort": true, 00:23:48.650 "nvme_admin": false, 00:23:48.650 "nvme_io": false 00:23:48.650 }, 00:23:48.650 "memory_domains": [ 00:23:48.650 { 00:23:48.650 "dma_device_id": "system", 00:23:48.650 "dma_device_type": 1 00:23:48.650 }, 00:23:48.650 { 00:23:48.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.650 "dma_device_type": 2 00:23:48.650 } 00:23:48.650 ], 00:23:48.650 "driver_specific": { 00:23:48.650 "passthru": { 00:23:48.650 "name": "Passthru0", 00:23:48.650 "base_bdev_name": "Malloc0" 00:23:48.650 } 00:23:48.650 } 00:23:48.650 } 00:23:48.650 ]' 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.650 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:23:48.650 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.651 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.651 11:34:32 rpc.rpc_integrity -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.651 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:23:48.651 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:23:48.651 11:34:32 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:23:48.651 00:23:48.651 real 0m0.271s 00:23:48.651 user 0m0.169s 00:23:48.651 sys 0m0.046s 00:23:48.651 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:48.651 11:34:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:48.651 ************************************ 00:23:48.651 END TEST rpc_integrity 00:23:48.651 ************************************ 00:23:48.651 11:34:32 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:23:48.651 11:34:32 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:48.651 11:34:32 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:48.651 11:34:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:23:48.651 ************************************ 00:23:48.651 START TEST rpc_plugins 00:23:48.651 ************************************ 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # rpc_plugins 00:23:48.651 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.651 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:23:48.651 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:23:48.651 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
00:23:48.651 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:23:48.651 { 00:23:48.651 "name": "Malloc1", 00:23:48.651 "aliases": [ 00:23:48.651 "74b6e473-3c24-4816-a734-086309f6d7b8" 00:23:48.651 ], 00:23:48.651 "product_name": "Malloc disk", 00:23:48.651 "block_size": 4096, 00:23:48.651 "num_blocks": 256, 00:23:48.651 "uuid": "74b6e473-3c24-4816-a734-086309f6d7b8", 00:23:48.651 "assigned_rate_limits": { 00:23:48.651 "rw_ios_per_sec": 0, 00:23:48.651 "rw_mbytes_per_sec": 0, 00:23:48.651 "r_mbytes_per_sec": 0, 00:23:48.651 "w_mbytes_per_sec": 0 00:23:48.651 }, 00:23:48.651 "claimed": false, 00:23:48.651 "zoned": false, 00:23:48.651 "supported_io_types": { 00:23:48.651 "read": true, 00:23:48.651 "write": true, 00:23:48.651 "unmap": true, 00:23:48.651 "write_zeroes": true, 00:23:48.651 "flush": true, 00:23:48.651 "reset": true, 00:23:48.651 "compare": false, 00:23:48.651 "compare_and_write": false, 00:23:48.651 "abort": true, 00:23:48.651 "nvme_admin": false, 00:23:48.651 "nvme_io": false 00:23:48.651 }, 00:23:48.651 "memory_domains": [ 00:23:48.651 { 00:23:48.651 "dma_device_id": "system", 00:23:48.651 "dma_device_type": 1 00:23:48.651 }, 00:23:48.651 { 00:23:48.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.651 "dma_device_type": 2 00:23:48.651 } 00:23:48.651 ], 00:23:48.651 "driver_specific": {} 00:23:48.651 } 00:23:48.651 ]' 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:23:48.909 11:34:32 
rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:23:48.909 11:34:32 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:23:48.909 00:23:48.909 real 0m0.143s 00:23:48.909 user 0m0.088s 00:23:48.909 sys 0m0.026s 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:48.909 11:34:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:23:48.909 ************************************ 00:23:48.909 END TEST rpc_plugins 00:23:48.909 ************************************ 00:23:48.909 11:34:32 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:23:48.909 11:34:32 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:48.909 11:34:32 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:48.909 11:34:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:23:48.909 ************************************ 00:23:48.909 START TEST rpc_trace_cmd_test 00:23:48.909 ************************************ 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # rpc_trace_cmd_test 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:23:48.909 "tpoint_shm_path": 
"/dev/shm/spdk_tgt_trace.pid94389", 00:23:48.909 "tpoint_group_mask": "0x8", 00:23:48.909 "iscsi_conn": { 00:23:48.909 "mask": "0x2", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "scsi": { 00:23:48.909 "mask": "0x4", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "bdev": { 00:23:48.909 "mask": "0x8", 00:23:48.909 "tpoint_mask": "0xffffffffffffffff" 00:23:48.909 }, 00:23:48.909 "nvmf_rdma": { 00:23:48.909 "mask": "0x10", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "nvmf_tcp": { 00:23:48.909 "mask": "0x20", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "ftl": { 00:23:48.909 "mask": "0x40", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "blobfs": { 00:23:48.909 "mask": "0x80", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "dsa": { 00:23:48.909 "mask": "0x200", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "thread": { 00:23:48.909 "mask": "0x400", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "nvme_pcie": { 00:23:48.909 "mask": "0x800", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "iaa": { 00:23:48.909 "mask": "0x1000", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "nvme_tcp": { 00:23:48.909 "mask": "0x2000", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "bdev_nvme": { 00:23:48.909 "mask": "0x4000", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 }, 00:23:48.909 "sock": { 00:23:48.909 "mask": "0x8000", 00:23:48.909 "tpoint_mask": "0x0" 00:23:48.909 } 00:23:48.909 }' 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:23:48.909 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:23:49.167 00:23:49.167 real 0m0.198s 00:23:49.167 user 0m0.156s 00:23:49.167 sys 0m0.035s 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:49.167 11:34:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:23:49.167 ************************************ 00:23:49.167 END TEST rpc_trace_cmd_test 00:23:49.167 ************************************ 00:23:49.167 11:34:33 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:23:49.167 11:34:33 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:23:49.167 11:34:33 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:23:49.167 11:34:33 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:49.167 11:34:33 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:49.167 11:34:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:23:49.167 ************************************ 00:23:49.167 START TEST rpc_daemon_integrity 00:23:49.167 ************************************ 00:23:49.167 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:23:49.167 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:23:49.167 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.167 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.167 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.167 11:34:33 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:23:49.167 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.425 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:23:49.425 { 00:23:49.425 "name": "Malloc2", 00:23:49.425 "aliases": [ 00:23:49.425 "14df8545-2ddd-4700-9e1a-0989c20d0327" 00:23:49.425 ], 00:23:49.425 "product_name": "Malloc disk", 00:23:49.425 "block_size": 512, 00:23:49.425 "num_blocks": 16384, 00:23:49.425 "uuid": "14df8545-2ddd-4700-9e1a-0989c20d0327", 00:23:49.425 "assigned_rate_limits": { 00:23:49.425 "rw_ios_per_sec": 0, 00:23:49.425 "rw_mbytes_per_sec": 0, 00:23:49.425 "r_mbytes_per_sec": 0, 00:23:49.425 "w_mbytes_per_sec": 0 00:23:49.425 }, 00:23:49.425 "claimed": false, 00:23:49.425 "zoned": false, 00:23:49.425 "supported_io_types": { 00:23:49.425 "read": true, 00:23:49.425 "write": true, 00:23:49.425 "unmap": true, 00:23:49.425 "write_zeroes": true, 00:23:49.425 "flush": true, 00:23:49.425 "reset": true, 00:23:49.425 "compare": false, 00:23:49.425 "compare_and_write": 
false, 00:23:49.426 "abort": true, 00:23:49.426 "nvme_admin": false, 00:23:49.426 "nvme_io": false 00:23:49.426 }, 00:23:49.426 "memory_domains": [ 00:23:49.426 { 00:23:49.426 "dma_device_id": "system", 00:23:49.426 "dma_device_type": 1 00:23:49.426 }, 00:23:49.426 { 00:23:49.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:49.426 "dma_device_type": 2 00:23:49.426 } 00:23:49.426 ], 00:23:49.426 "driver_specific": {} 00:23:49.426 } 00:23:49.426 ]' 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.426 [2024-06-10 11:34:33.195785] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:23:49.426 [2024-06-10 11:34:33.195814] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.426 [2024-06-10 11:34:33.195829] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a2f100 00:23:49.426 [2024-06-10 11:34:33.195838] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.426 [2024-06-10 11:34:33.196803] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.426 [2024-06-10 11:34:33.196823] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:23:49.426 Passthru0 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.426 11:34:33 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:23:49.426 { 00:23:49.426 "name": "Malloc2", 00:23:49.426 "aliases": [ 00:23:49.426 "14df8545-2ddd-4700-9e1a-0989c20d0327" 00:23:49.426 ], 00:23:49.426 "product_name": "Malloc disk", 00:23:49.426 "block_size": 512, 00:23:49.426 "num_blocks": 16384, 00:23:49.426 "uuid": "14df8545-2ddd-4700-9e1a-0989c20d0327", 00:23:49.426 "assigned_rate_limits": { 00:23:49.426 "rw_ios_per_sec": 0, 00:23:49.426 "rw_mbytes_per_sec": 0, 00:23:49.426 "r_mbytes_per_sec": 0, 00:23:49.426 "w_mbytes_per_sec": 0 00:23:49.426 }, 00:23:49.426 "claimed": true, 00:23:49.426 "claim_type": "exclusive_write", 00:23:49.426 "zoned": false, 00:23:49.426 "supported_io_types": { 00:23:49.426 "read": true, 00:23:49.426 "write": true, 00:23:49.426 "unmap": true, 00:23:49.426 "write_zeroes": true, 00:23:49.426 "flush": true, 00:23:49.426 "reset": true, 00:23:49.426 "compare": false, 00:23:49.426 "compare_and_write": false, 00:23:49.426 "abort": true, 00:23:49.426 "nvme_admin": false, 00:23:49.426 "nvme_io": false 00:23:49.426 }, 00:23:49.426 "memory_domains": [ 00:23:49.426 { 00:23:49.426 "dma_device_id": "system", 00:23:49.426 "dma_device_type": 1 00:23:49.426 }, 00:23:49.426 { 00:23:49.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:49.426 "dma_device_type": 2 00:23:49.426 } 00:23:49.426 ], 00:23:49.426 "driver_specific": {} 00:23:49.426 }, 00:23:49.426 { 00:23:49.426 "name": "Passthru0", 00:23:49.426 "aliases": [ 00:23:49.426 "22e79ec9-4b43-5473-a5a7-76efb98438d6" 00:23:49.426 ], 00:23:49.426 "product_name": "passthru", 00:23:49.426 "block_size": 512, 00:23:49.426 "num_blocks": 16384, 00:23:49.426 "uuid": "22e79ec9-4b43-5473-a5a7-76efb98438d6", 00:23:49.426 "assigned_rate_limits": { 00:23:49.426 "rw_ios_per_sec": 0, 00:23:49.426 "rw_mbytes_per_sec": 0, 
00:23:49.426 "r_mbytes_per_sec": 0, 00:23:49.426 "w_mbytes_per_sec": 0 00:23:49.426 }, 00:23:49.426 "claimed": false, 00:23:49.426 "zoned": false, 00:23:49.426 "supported_io_types": { 00:23:49.426 "read": true, 00:23:49.426 "write": true, 00:23:49.426 "unmap": true, 00:23:49.426 "write_zeroes": true, 00:23:49.426 "flush": true, 00:23:49.426 "reset": true, 00:23:49.426 "compare": false, 00:23:49.426 "compare_and_write": false, 00:23:49.426 "abort": true, 00:23:49.426 "nvme_admin": false, 00:23:49.426 "nvme_io": false 00:23:49.426 }, 00:23:49.426 "memory_domains": [ 00:23:49.426 { 00:23:49.426 "dma_device_id": "system", 00:23:49.426 "dma_device_type": 1 00:23:49.426 }, 00:23:49.426 { 00:23:49.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:49.426 "dma_device_type": 2 00:23:49.426 } 00:23:49.426 ], 00:23:49.426 "driver_specific": { 00:23:49.426 "passthru": { 00:23:49.426 "name": "Passthru0", 00:23:49.426 "base_bdev_name": "Malloc2" 00:23:49.426 } 00:23:49.426 } 00:23:49.426 } 00:23:49.426 ]' 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.426 11:34:33 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:23:49.426 00:23:49.426 real 0m0.269s 00:23:49.426 user 0m0.170s 00:23:49.426 sys 0m0.048s 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:49.426 11:34:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:23:49.426 ************************************ 00:23:49.426 END TEST rpc_daemon_integrity 00:23:49.426 ************************************ 00:23:49.426 11:34:33 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:23:49.426 11:34:33 rpc -- rpc/rpc.sh@84 -- # killprocess 94389 00:23:49.426 11:34:33 rpc -- common/autotest_common.sh@949 -- # '[' -z 94389 ']' 00:23:49.426 11:34:33 rpc -- common/autotest_common.sh@953 -- # kill -0 94389 00:23:49.426 11:34:33 rpc -- common/autotest_common.sh@954 -- # uname 00:23:49.684 11:34:33 rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:49.684 11:34:33 rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 94389 00:23:49.684 11:34:33 rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:49.684 11:34:33 rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:49.684 11:34:33 rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 94389' 00:23:49.684 killing process with pid 94389 00:23:49.684 11:34:33 rpc -- common/autotest_common.sh@968 -- # kill 94389 00:23:49.684 11:34:33 
rpc -- common/autotest_common.sh@973 -- # wait 94389 00:23:49.942 00:23:49.942 real 0m2.549s 00:23:49.942 user 0m3.145s 00:23:49.942 sys 0m0.844s 00:23:49.942 11:34:33 rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:49.942 11:34:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:23:49.942 ************************************ 00:23:49.942 END TEST rpc 00:23:49.942 ************************************ 00:23:49.942 11:34:33 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:23:49.942 11:34:33 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:49.942 11:34:33 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:49.942 11:34:33 -- common/autotest_common.sh@10 -- # set +x 00:23:49.942 ************************************ 00:23:49.942 START TEST skip_rpc 00:23:49.942 ************************************ 00:23:49.942 11:34:33 skip_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:23:50.200 * Looking for test storage... 
00:23:50.200 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:23:50.200 11:34:33 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:23:50.200 11:34:33 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:23:50.200 11:34:33 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:23:50.200 11:34:33 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:50.200 11:34:33 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:50.200 11:34:33 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:23:50.200 ************************************ 00:23:50.200 START TEST skip_rpc 00:23:50.200 ************************************ 00:23:50.200 11:34:33 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # test_skip_rpc 00:23:50.200 11:34:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=94919 00:23:50.200 11:34:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:23:50.200 11:34:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:23:50.200 11:34:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:23:50.200 [2024-06-10 11:34:34.041324] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:23:50.200 [2024-06-10 11:34:34.041368] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid94919 ] 00:23:50.200 [2024-06-10 11:34:34.126510] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:50.458 [2024-06-10 11:34:34.207840] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@649 -- # local es=0 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # valid_exec_arg rpc_cmd spdk_get_version 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@637 -- # local arg=rpc_cmd 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # type -t rpc_cmd 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # rpc_cmd spdk_get_version 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # es=1 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:55.719 11:34:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT 
SIGTERM EXIT 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 94919 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@949 -- # '[' -z 94919 ']' 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # kill -0 94919 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # uname 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 94919 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 94919' 00:23:55.719 killing process with pid 94919 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # kill 94919 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # wait 94919 00:23:55.719 00:23:55.719 real 0m5.388s 00:23:55.719 user 0m5.089s 00:23:55.719 sys 0m0.305s 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:55.719 11:34:39 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:23:55.719 ************************************ 00:23:55.719 END TEST skip_rpc 00:23:55.719 ************************************ 00:23:55.719 11:34:39 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:23:55.719 11:34:39 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:23:55.719 11:34:39 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:55.719 11:34:39 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:23:55.719 ************************************ 00:23:55.719 START TEST skip_rpc_with_json 00:23:55.719 
************************************ 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_json 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=95651 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 95651 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@830 -- # '[' -z 95651 ']' 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:55.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:55.719 11:34:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:23:55.719 [2024-06-10 11:34:39.506757] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:23:55.719 [2024-06-10 11:34:39.506812] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95651 ] 00:23:55.719 [2024-06-10 11:34:39.594420] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:55.977 [2024-06-10 11:34:39.682069] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@863 -- # return 0 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:23:56.545 [2024-06-10 11:34:40.298622] nvmf_rpc.c:2558:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:23:56.545 request: 00:23:56.545 { 00:23:56.545 "trtype": "tcp", 00:23:56.545 "method": "nvmf_get_transports", 00:23:56.545 "req_id": 1 00:23:56.545 } 00:23:56.545 Got JSON-RPC error response 00:23:56.545 response: 00:23:56.545 { 00:23:56.545 "code": -19, 00:23:56.545 "message": "No such device" 00:23:56.545 } 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:23:56.545 [2024-06-10 11:34:40.310730] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:23:56.545 11:34:40 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:23:56.545 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:23:56.545 { 00:23:56.545 "subsystems": [ 00:23:56.545 { 00:23:56.545 "subsystem": "keyring", 00:23:56.545 "config": [] 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "subsystem": "iobuf", 00:23:56.545 "config": [ 00:23:56.545 { 00:23:56.545 "method": "iobuf_set_options", 00:23:56.545 "params": { 00:23:56.545 "small_pool_count": 8192, 00:23:56.545 "large_pool_count": 1024, 00:23:56.545 "small_bufsize": 8192, 00:23:56.545 "large_bufsize": 135168 00:23:56.545 } 00:23:56.545 } 00:23:56.545 ] 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "subsystem": "sock", 00:23:56.545 "config": [ 00:23:56.545 { 00:23:56.545 "method": "sock_set_default_impl", 00:23:56.545 "params": { 00:23:56.545 "impl_name": "posix" 00:23:56.545 } 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "method": "sock_impl_set_options", 00:23:56.545 "params": { 00:23:56.545 "impl_name": "ssl", 00:23:56.545 "recv_buf_size": 4096, 00:23:56.545 "send_buf_size": 4096, 00:23:56.545 "enable_recv_pipe": true, 00:23:56.545 "enable_quickack": false, 00:23:56.545 "enable_placement_id": 0, 00:23:56.545 "enable_zerocopy_send_server": true, 00:23:56.545 "enable_zerocopy_send_client": false, 00:23:56.545 "zerocopy_threshold": 0, 00:23:56.545 "tls_version": 0, 00:23:56.545 "enable_ktls": false 00:23:56.545 } 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "method": "sock_impl_set_options", 00:23:56.545 "params": { 
00:23:56.545 "impl_name": "posix", 00:23:56.545 "recv_buf_size": 2097152, 00:23:56.545 "send_buf_size": 2097152, 00:23:56.545 "enable_recv_pipe": true, 00:23:56.545 "enable_quickack": false, 00:23:56.545 "enable_placement_id": 0, 00:23:56.545 "enable_zerocopy_send_server": true, 00:23:56.545 "enable_zerocopy_send_client": false, 00:23:56.545 "zerocopy_threshold": 0, 00:23:56.545 "tls_version": 0, 00:23:56.545 "enable_ktls": false 00:23:56.545 } 00:23:56.545 } 00:23:56.545 ] 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "subsystem": "vmd", 00:23:56.545 "config": [] 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "subsystem": "accel", 00:23:56.545 "config": [ 00:23:56.545 { 00:23:56.545 "method": "accel_set_options", 00:23:56.545 "params": { 00:23:56.545 "small_cache_size": 128, 00:23:56.545 "large_cache_size": 16, 00:23:56.545 "task_count": 2048, 00:23:56.545 "sequence_count": 2048, 00:23:56.545 "buf_count": 2048 00:23:56.545 } 00:23:56.545 } 00:23:56.545 ] 00:23:56.545 }, 00:23:56.545 { 00:23:56.545 "subsystem": "bdev", 00:23:56.545 "config": [ 00:23:56.545 { 00:23:56.545 "method": "bdev_set_options", 00:23:56.545 "params": { 00:23:56.545 "bdev_io_pool_size": 65535, 00:23:56.545 "bdev_io_cache_size": 256, 00:23:56.545 "bdev_auto_examine": true, 00:23:56.546 "iobuf_small_cache_size": 128, 00:23:56.546 "iobuf_large_cache_size": 16 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "bdev_raid_set_options", 00:23:56.546 "params": { 00:23:56.546 "process_window_size_kb": 1024 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "bdev_iscsi_set_options", 00:23:56.546 "params": { 00:23:56.546 "timeout_sec": 30 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "bdev_nvme_set_options", 00:23:56.546 "params": { 00:23:56.546 "action_on_timeout": "none", 00:23:56.546 "timeout_us": 0, 00:23:56.546 "timeout_admin_us": 0, 00:23:56.546 "keep_alive_timeout_ms": 10000, 00:23:56.546 "arbitration_burst": 0, 00:23:56.546 
"low_priority_weight": 0, 00:23:56.546 "medium_priority_weight": 0, 00:23:56.546 "high_priority_weight": 0, 00:23:56.546 "nvme_adminq_poll_period_us": 10000, 00:23:56.546 "nvme_ioq_poll_period_us": 0, 00:23:56.546 "io_queue_requests": 0, 00:23:56.546 "delay_cmd_submit": true, 00:23:56.546 "transport_retry_count": 4, 00:23:56.546 "bdev_retry_count": 3, 00:23:56.546 "transport_ack_timeout": 0, 00:23:56.546 "ctrlr_loss_timeout_sec": 0, 00:23:56.546 "reconnect_delay_sec": 0, 00:23:56.546 "fast_io_fail_timeout_sec": 0, 00:23:56.546 "disable_auto_failback": false, 00:23:56.546 "generate_uuids": false, 00:23:56.546 "transport_tos": 0, 00:23:56.546 "nvme_error_stat": false, 00:23:56.546 "rdma_srq_size": 0, 00:23:56.546 "io_path_stat": false, 00:23:56.546 "allow_accel_sequence": false, 00:23:56.546 "rdma_max_cq_size": 0, 00:23:56.546 "rdma_cm_event_timeout_ms": 0, 00:23:56.546 "dhchap_digests": [ 00:23:56.546 "sha256", 00:23:56.546 "sha384", 00:23:56.546 "sha512" 00:23:56.546 ], 00:23:56.546 "dhchap_dhgroups": [ 00:23:56.546 "null", 00:23:56.546 "ffdhe2048", 00:23:56.546 "ffdhe3072", 00:23:56.546 "ffdhe4096", 00:23:56.546 "ffdhe6144", 00:23:56.546 "ffdhe8192" 00:23:56.546 ] 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "bdev_nvme_set_hotplug", 00:23:56.546 "params": { 00:23:56.546 "period_us": 100000, 00:23:56.546 "enable": false 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "bdev_wait_for_examine" 00:23:56.546 } 00:23:56.546 ] 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": "scsi", 00:23:56.546 "config": null 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": "scheduler", 00:23:56.546 "config": [ 00:23:56.546 { 00:23:56.546 "method": "framework_set_scheduler", 00:23:56.546 "params": { 00:23:56.546 "name": "static" 00:23:56.546 } 00:23:56.546 } 00:23:56.546 ] 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": "vhost_scsi", 00:23:56.546 "config": [] 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": 
"vhost_blk", 00:23:56.546 "config": [] 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": "ublk", 00:23:56.546 "config": [] 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": "nbd", 00:23:56.546 "config": [] 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "subsystem": "nvmf", 00:23:56.546 "config": [ 00:23:56.546 { 00:23:56.546 "method": "nvmf_set_config", 00:23:56.546 "params": { 00:23:56.546 "discovery_filter": "match_any", 00:23:56.546 "admin_cmd_passthru": { 00:23:56.546 "identify_ctrlr": false 00:23:56.546 } 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "nvmf_set_max_subsystems", 00:23:56.546 "params": { 00:23:56.546 "max_subsystems": 1024 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "nvmf_set_crdt", 00:23:56.546 "params": { 00:23:56.546 "crdt1": 0, 00:23:56.546 "crdt2": 0, 00:23:56.546 "crdt3": 0 00:23:56.546 } 00:23:56.546 }, 00:23:56.546 { 00:23:56.546 "method": "nvmf_create_transport", 00:23:56.546 "params": { 00:23:56.546 "trtype": "TCP", 00:23:56.546 "max_queue_depth": 128, 00:23:56.546 "max_io_qpairs_per_ctrlr": 127, 00:23:56.546 "in_capsule_data_size": 4096, 00:23:56.546 "max_io_size": 131072, 00:23:56.546 "io_unit_size": 131072, 00:23:56.546 "max_aq_depth": 128, 00:23:56.546 "num_shared_buffers": 511, 00:23:56.546 "buf_cache_size": 4294967295, 00:23:56.546 "dif_insert_or_strip": false, 00:23:56.546 "zcopy": false, 00:23:56.546 "c2h_success": true, 00:23:56.546 "sock_priority": 0, 00:23:56.546 "abort_timeout_sec": 1, 00:23:56.546 "ack_timeout": 0, 00:23:56.546 "data_wr_pool_size": 0 00:23:56.546 } 00:23:56.546 } 00:23:56.546 ] 00:23:56.546 }, 00:23:56.546 { 00:23:56.547 "subsystem": "iscsi", 00:23:56.547 "config": [ 00:23:56.547 { 00:23:56.547 "method": "iscsi_set_options", 00:23:56.547 "params": { 00:23:56.547 "node_base": "iqn.2016-06.io.spdk", 00:23:56.547 "max_sessions": 128, 00:23:56.547 "max_connections_per_session": 2, 00:23:56.547 "max_queue_depth": 64, 00:23:56.547 "default_time2wait": 2, 
00:23:56.547 "default_time2retain": 20, 00:23:56.547 "first_burst_length": 8192, 00:23:56.547 "immediate_data": true, 00:23:56.547 "allow_duplicated_isid": false, 00:23:56.547 "error_recovery_level": 0, 00:23:56.547 "nop_timeout": 60, 00:23:56.547 "nop_in_interval": 30, 00:23:56.547 "disable_chap": false, 00:23:56.547 "require_chap": false, 00:23:56.547 "mutual_chap": false, 00:23:56.547 "chap_group": 0, 00:23:56.547 "max_large_datain_per_connection": 64, 00:23:56.547 "max_r2t_per_connection": 4, 00:23:56.547 "pdu_pool_size": 36864, 00:23:56.547 "immediate_data_pool_size": 16384, 00:23:56.547 "data_out_pool_size": 2048 00:23:56.547 } 00:23:56.547 } 00:23:56.547 ] 00:23:56.547 } 00:23:56.547 ] 00:23:56.547 } 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 95651 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 95651 ']' 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 95651 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:56.547 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 95651 00:23:56.805 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:56.805 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:56.805 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 95651' 00:23:56.805 killing process with pid 95651 00:23:56.805 11:34:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 95651 00:23:56.805 11:34:40 skip_rpc.skip_rpc_with_json -- 
common/autotest_common.sh@973 -- # wait 95651 00:23:57.063 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=95839 00:23:57.063 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:23:57.063 11:34:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 95839 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 95839 ']' 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 95839 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 95839 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 95839' 00:24:02.323 killing process with pid 95839 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 95839 00:24:02.323 11:34:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 95839 00:24:02.323 11:34:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:24:02.581 00:24:02.581 real 0m6.827s 00:24:02.581 user 0m6.500s 00:24:02.581 sys 0m0.731s 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:24:02.581 ************************************ 00:24:02.581 END TEST skip_rpc_with_json 00:24:02.581 ************************************ 00:24:02.581 11:34:46 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:24:02.581 11:34:46 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:02.581 11:34:46 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:02.581 11:34:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:24:02.581 ************************************ 00:24:02.581 START TEST skip_rpc_with_delay 00:24:02.581 ************************************ 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_delay 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@649 -- # local es=0 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:24:02.581 [2024-06-10 11:34:46.420049] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:24:02.581 [2024-06-10 11:34:46.420131] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # es=1 00:24:02.581 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:02.582 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:02.582 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:02.582 00:24:02.582 real 0m0.078s 00:24:02.582 user 0m0.045s 00:24:02.582 sys 0m0.032s 00:24:02.582 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:02.582 11:34:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:24:02.582 ************************************ 00:24:02.582 END TEST skip_rpc_with_delay 00:24:02.582 ************************************ 00:24:02.582 11:34:46 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:24:02.582 11:34:46 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:24:02.582 11:34:46 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:24:02.582 11:34:46 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:02.582 11:34:46 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:02.582 11:34:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:24:02.582 ************************************ 00:24:02.582 START TEST exit_on_failed_rpc_init 00:24:02.582 ************************************ 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # test_exit_on_failed_rpc_init 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=96594 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 96594 00:24:02.582 11:34:46 
skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@830 -- # '[' -z 96594 ']' 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:02.582 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:02.582 11:34:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:24:02.840 [2024-06-10 11:34:46.574623] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:24:02.840 [2024-06-10 11:34:46.574679] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96594 ] 00:24:02.840 [2024-06-10 11:34:46.662282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:02.840 [2024-06-10 11:34:46.749400] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@863 -- # return 0 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@649 -- # local es=0 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:24:03.773 [2024-06-10 11:34:47.432928] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:24:03.773 [2024-06-10 11:34:47.432979] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96756 ] 00:24:03.773 [2024-06-10 11:34:47.518948] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:03.773 [2024-06-10 11:34:47.600810] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:24:03.773 [2024-06-10 11:34:47.600886] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:24:03.773 [2024-06-10 11:34:47.600898] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:24:03.773 [2024-06-10 11:34:47.600905] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # es=234 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # es=106 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # case "$es" in 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@669 -- # es=1 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 96594 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@949 -- # '[' -z 96594 ']' 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # kill -0 96594 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # uname 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:03.773 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 96594 00:24:04.031 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:04.031 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:04.031 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # echo 'killing process with pid 96594' 
00:24:04.031 killing process with pid 96594 00:24:04.031 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # kill 96594 00:24:04.031 11:34:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # wait 96594 00:24:04.289 00:24:04.289 real 0m1.579s 00:24:04.289 user 0m1.741s 00:24:04.289 sys 0m0.520s 00:24:04.289 11:34:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:04.289 11:34:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:24:04.289 ************************************ 00:24:04.289 END TEST exit_on_failed_rpc_init 00:24:04.289 ************************************ 00:24:04.289 11:34:48 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:24:04.289 00:24:04.289 real 0m14.296s 00:24:04.289 user 0m13.521s 00:24:04.289 sys 0m1.894s 00:24:04.289 11:34:48 skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:04.289 11:34:48 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:24:04.289 ************************************ 00:24:04.289 END TEST skip_rpc 00:24:04.289 ************************************ 00:24:04.289 11:34:48 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:24:04.289 11:34:48 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:04.289 11:34:48 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:04.289 11:34:48 -- common/autotest_common.sh@10 -- # set +x 00:24:04.289 ************************************ 00:24:04.289 START TEST rpc_client 00:24:04.289 ************************************ 00:24:04.289 11:34:48 rpc_client -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:24:04.547 * Looking for test storage... 
00:24:04.547 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:24:04.547 11:34:48 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:24:04.547 OK 00:24:04.547 11:34:48 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:24:04.547 00:24:04.547 real 0m0.134s 00:24:04.547 user 0m0.065s 00:24:04.547 sys 0m0.080s 00:24:04.547 11:34:48 rpc_client -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:04.547 11:34:48 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:24:04.547 ************************************ 00:24:04.547 END TEST rpc_client 00:24:04.547 ************************************ 00:24:04.547 11:34:48 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:24:04.547 11:34:48 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:04.547 11:34:48 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:04.547 11:34:48 -- common/autotest_common.sh@10 -- # set +x 00:24:04.547 ************************************ 00:24:04.547 START TEST json_config 00:24:04.547 ************************************ 00:24:04.547 11:34:48 json_config -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:24:04.805 11:34:48 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@7 -- # uname -s 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@12 
-- # NVMF_IP_PREFIX=192.168.100 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:804da62e-425e-e711-906e-0017a4403562 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=804da62e-425e-e711-906e-0017a4403562 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:04.805 11:34:48 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:04.806 11:34:48 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:04.806 11:34:48 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:04.806 11:34:48 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:04.806 11:34:48 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.806 11:34:48 
json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.806 11:34:48 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.806 11:34:48 json_config -- paths/export.sh@5 -- # export PATH 00:24:04.806 11:34:48 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@47 -- # : 0 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@33 -- # 
'[' -n '' ']' 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:04.806 11:34:48 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@355 -- # 
trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:24:04.806 INFO: JSON configuration test init 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:04.806 11:34:48 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:24:04.806 11:34:48 json_config -- json_config/common.sh@9 -- # local app=target 00:24:04.806 11:34:48 json_config -- json_config/common.sh@10 -- # shift 00:24:04.806 11:34:48 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:24:04.806 11:34:48 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:24:04.806 11:34:48 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:24:04.806 11:34:48 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:24:04.806 11:34:48 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:24:04.806 11:34:48 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=96966 00:24:04.806 11:34:48 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:24:04.806 Waiting for target to run... 
00:24:04.806 11:34:48 json_config -- json_config/common.sh@25 -- # waitforlisten 96966 /var/tmp/spdk_tgt.sock 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@830 -- # '[' -z 96966 ']' 00:24:04.806 11:34:48 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:24:04.806 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:04.806 11:34:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:04.806 [2024-06-10 11:34:48.619275] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:24:04.806 [2024-06-10 11:34:48.619337] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96966 ] 00:24:05.373 [2024-06-10 11:34:49.141046] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:05.373 [2024-06-10 11:34:49.232245] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:05.630 11:34:49 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:05.630 11:34:49 json_config -- common/autotest_common.sh@863 -- # return 0 00:24:05.630 11:34:49 json_config -- json_config/common.sh@26 -- # echo '' 00:24:05.630 00:24:05.630 11:34:49 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:24:05.630 11:34:49 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:24:05.630 11:34:49 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:05.630 11:34:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:05.630 11:34:49 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:24:05.630 11:34:49 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:24:05.630 11:34:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:24:05.888 11:34:49 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:24:05.888 11:34:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:24:05.888 [2024-06-10 11:34:49.737860] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:24:05.888 11:34:49 
json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:24:05.888 11:34:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:24:06.145 [2024-06-10 11:34:49.930349] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:24:06.145 11:34:49 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:24:06.145 11:34:49 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:06.145 11:34:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:06.145 11:34:49 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:24:06.145 11:34:49 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:24:06.145 11:34:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:24:06.403 [2024-06-10 11:34:50.178056] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:24:08.930 11:34:52 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:24:08.930 11:34:52 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:24:08.930 11:34:52 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:08.930 11:34:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:08.930 11:34:52 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:24:08.930 11:34:52 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:24:08.930 11:34:52 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:24:08.930 11:34:52 json_config -- 
json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:24:08.930 11:34:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:24:08.930 11:34:52 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@48 -- # local get_types 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:24:09.188 11:34:52 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:09.188 11:34:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@55 -- # return 0 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:24:09.188 11:34:52 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:09.188 11:34:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@59 
-- # local ev_type ev_ctx event_id 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:24:09.188 11:34:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:24:09.188 11:34:52 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:24:09.188 11:34:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:24:09.188 11:34:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:09.188 11:34:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:09.188 11:34:53 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:24:09.188 11:34:53 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:24:09.188 11:34:53 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:24:09.188 11:34:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:24:09.446 Nvme0n1p0 Nvme0n1p1 00:24:09.446 11:34:53 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:24:09.446 11:34:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:24:09.704 [2024-06-10 11:34:53.452145] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:24:09.704 [2024-06-10 11:34:53.452196] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:24:09.704 
00:24:09.704 11:34:53 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:24:09.704 11:34:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:24:09.704 Malloc3 00:24:09.704 11:34:53 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:24:09.704 11:34:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:24:09.962 [2024-06-10 11:34:53.797090] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:24:09.962 [2024-06-10 11:34:53.797138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:09.962 [2024-06-10 11:34:53.797156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c5700 00:24:09.962 [2024-06-10 11:34:53.797164] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:09.962 [2024-06-10 11:34:53.798413] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:09.962 [2024-06-10 11:34:53.798441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:24:09.962 PTBdevFromMalloc3 00:24:09.962 11:34:53 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:24:09.962 11:34:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:24:10.220 Null0 00:24:10.220 11:34:53 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:24:10.220 11:34:53 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:24:10.220 Malloc0 00:24:10.220 11:34:54 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:24:10.220 11:34:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:24:10.477 Malloc1 00:24:10.477 11:34:54 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:24:10.477 11:34:54 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:24:10.735 102400+0 records in 00:24:10.735 102400+0 records out 00:24:10.735 104857600 bytes (105 MB, 100 MiB) copied, 0.217165 s, 483 MB/s 00:24:10.735 11:34:54 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:24:10.735 11:34:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:24:10.993 aio_disk 00:24:10.993 11:34:54 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:24:10.993 11:34:54 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:24:10.993 11:34:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:24:10.993 f6a44dc5-ad30-4dcf-aee2-56736e30910e 
00:24:11.251 11:34:54 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:24:11.251 11:34:54 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:24:11.251 11:34:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:24:11.251 11:34:55 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:24:11.251 11:34:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:24:11.509 11:34:55 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:24:11.509 11:34:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:24:11.767 11:34:55 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:24:11.767 11:34:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:24:11.767 11:34:55 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:24:11.767 11:34:55 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:24:11.767 11:34:55 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:24:12.025 MallocForCryptoBdev 00:24:12.025 11:34:55 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:24:12.025 11:34:55 json_config -- json_config/json_config.sh@159 -- # wc -l 00:24:12.025 11:34:55 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:24:12.025 11:34:55 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:24:12.025 11:34:55 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:24:12.025 11:34:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:24:12.283 [2024-06-10 11:34:55.984589] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:24:12.283 CryptoMallocBdev 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:e3e0c8d4-dea5-4d27-8e88-bcd11016ee91 bdev_register:87e550b5-41f4-43f9-bfa5-7cfd1dcc2797 bdev_register:b3471c82-f5e4-4501-be11-9ffdba8ad605 bdev_register:14140433-3ad6-4fae-9fa1-bc1af11c139b 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:24:12.283 11:34:55 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:e3e0c8d4-dea5-4d27-8e88-bcd11016ee91 bdev_register:87e550b5-41f4-43f9-bfa5-7cfd1dcc2797 bdev_register:b3471c82-f5e4-4501-be11-9ffdba8ad605 bdev_register:14140433-3ad6-4fae-9fa1-bc1af11c139b bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@71 -- # sort 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@72 -- # sort 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:24:12.283 11:34:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:24:12.283 11:34:56 json_config -- 
json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- 
json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:e3e0c8d4-dea5-4d27-8e88-bcd11016ee91 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:87e550b5-41f4-43f9-bfa5-7cfd1dcc2797 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:b3471c82-f5e4-4501-be11-9ffdba8ad605 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:14140433-3ad6-4fae-9fa1-bc1af11c139b 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:24:12.283 11:34:56 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:24:12.284 11:34:56 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:14140433-3ad6-4fae-9fa1-bc1af11c139b bdev_register:87e550b5-41f4-43f9-bfa5-7cfd1dcc2797 bdev_register:aio_disk bdev_register:b3471c82-f5e4-4501-be11-9ffdba8ad605 bdev_register:CryptoMallocBdev bdev_register:e3e0c8d4-dea5-4d27-8e88-bcd11016ee91 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\4\1\4\0\4\3\3\-\3\a\d\6\-\4\f\a\e\-\9\f\a\1\-\b\c\1\a\f\1\1\c\1\3\9\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\7\e\5\5\0\b\5\-\4\1\f\4\-\4\3\f\9\-\b\f\a\5\-\7\c\f\d\1\d\c\c\2\7\9\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\3\4\7\1\c\8\2\-\f\5\e\4\-\4\5\0\1\-\b\e\1\1\-\9\f\f\d\b\a\8\a\d\6\0\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\3\e\0\c\8\d\4\-\d\e\a\5\-\4\d\2\7\-\8\e\8\8\-\b\c\d\1\1\0\1\6\e\e\9\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:24:12.284 11:34:56 json_config -- json_config/json_config.sh@86 -- # cat 00:24:12.284 11:34:56 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:14140433-3ad6-4fae-9fa1-bc1af11c139b bdev_register:87e550b5-41f4-43f9-bfa5-7cfd1dcc2797 bdev_register:aio_disk bdev_register:b3471c82-f5e4-4501-be11-9ffdba8ad605 bdev_register:CryptoMallocBdev bdev_register:e3e0c8d4-dea5-4d27-8e88-bcd11016ee91 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:24:12.284 Expected events matched: 00:24:12.284 bdev_register:14140433-3ad6-4fae-9fa1-bc1af11c139b 00:24:12.284 bdev_register:87e550b5-41f4-43f9-bfa5-7cfd1dcc2797 00:24:12.284 
bdev_register:aio_disk 00:24:12.284 bdev_register:b3471c82-f5e4-4501-be11-9ffdba8ad605 00:24:12.284 bdev_register:CryptoMallocBdev 00:24:12.284 bdev_register:e3e0c8d4-dea5-4d27-8e88-bcd11016ee91 00:24:12.284 bdev_register:Malloc0 00:24:12.284 bdev_register:Malloc0p0 00:24:12.284 bdev_register:Malloc0p1 00:24:12.284 bdev_register:Malloc0p2 00:24:12.284 bdev_register:Malloc1 00:24:12.284 bdev_register:Malloc3 00:24:12.284 bdev_register:MallocForCryptoBdev 00:24:12.284 bdev_register:Null0 00:24:12.284 bdev_register:Nvme0n1 00:24:12.284 bdev_register:Nvme0n1p0 00:24:12.284 bdev_register:Nvme0n1p1 00:24:12.284 bdev_register:PTBdevFromMalloc3 00:24:12.284 11:34:56 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:24:12.284 11:34:56 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:12.284 11:34:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:12.541 11:34:56 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:24:12.541 11:34:56 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:24:12.541 11:34:56 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:24:12.541 11:34:56 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:24:12.541 11:34:56 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:12.541 11:34:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:12.541 11:34:56 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:24:12.541 11:34:56 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:24:12.541 11:34:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:24:12.541 MallocBdevForConfigChangeCheck 00:24:12.541 11:34:56 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:24:12.541 11:34:56 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:12.541 11:34:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:12.798 11:34:56 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:24:12.798 11:34:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:24:13.056 11:34:56 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:24:13.056 INFO: shutting down applications... 00:24:13.056 11:34:56 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:24:13.056 11:34:56 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:24:13.056 11:34:56 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:24:13.056 11:34:56 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:24:13.056 [2024-06-10 11:34:56.995494] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:24:14.952 Calling clear_iscsi_subsystem 00:24:14.952 Calling clear_nvmf_subsystem 00:24:14.952 Calling clear_nbd_subsystem 00:24:14.952 Calling clear_ublk_subsystem 00:24:14.952 Calling clear_vhost_blk_subsystem 00:24:14.952 Calling clear_vhost_scsi_subsystem 00:24:14.952 Calling clear_bdev_subsystem 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@343 -- # count=100 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@345 -- # break 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:24:14.953 11:34:58 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:24:14.953 11:34:58 json_config -- json_config/common.sh@31 -- # local app=target 00:24:14.953 11:34:58 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:24:14.953 11:34:58 json_config -- json_config/common.sh@35 -- # [[ -n 96966 ]] 00:24:14.953 11:34:58 json_config -- json_config/common.sh@38 -- # kill -SIGINT 96966 00:24:14.953 11:34:58 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:24:14.953 11:34:58 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:24:14.953 11:34:58 json_config -- json_config/common.sh@41 -- # kill -0 96966 00:24:14.953 11:34:58 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:24:15.519 11:34:59 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:24:15.519 11:34:59 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:24:15.519 11:34:59 json_config -- json_config/common.sh@41 -- # kill -0 96966 00:24:15.519 11:34:59 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:24:15.519 11:34:59 json_config -- json_config/common.sh@43 -- # break 00:24:15.519 11:34:59 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:24:15.519 11:34:59 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:24:15.519 SPDK target shutdown 
done 00:24:15.519 11:34:59 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:24:15.519 INFO: relaunching applications... 00:24:15.519 11:34:59 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:15.519 11:34:59 json_config -- json_config/common.sh@9 -- # local app=target 00:24:15.519 11:34:59 json_config -- json_config/common.sh@10 -- # shift 00:24:15.519 11:34:59 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:24:15.519 11:34:59 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:24:15.519 11:34:59 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:24:15.519 11:34:59 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:24:15.519 11:34:59 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:24:15.519 11:34:59 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=98576 00:24:15.519 11:34:59 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:24:15.519 Waiting for target to run... 00:24:15.519 11:34:59 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:15.519 11:34:59 json_config -- json_config/common.sh@25 -- # waitforlisten 98576 /var/tmp/spdk_tgt.sock 00:24:15.519 11:34:59 json_config -- common/autotest_common.sh@830 -- # '[' -z 98576 ']' 00:24:15.519 11:34:59 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:24:15.519 11:34:59 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:15.519 11:34:59 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:24:15.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:24:15.519 11:34:59 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:15.519 11:34:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:15.778 [2024-06-10 11:34:59.466490] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:24:15.778 [2024-06-10 11:34:59.466555] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid98576 ] 00:24:16.345 [2024-06-10 11:35:00.028268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:16.345 [2024-06-10 11:35:00.104702] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:16.345 [2024-06-10 11:35:00.158195] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:24:16.345 [2024-06-10 11:35:00.166236] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:24:16.345 [2024-06-10 11:35:00.174245] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:24:16.345 [2024-06-10 11:35:00.253648] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:24:18.966 [2024-06-10 11:35:02.427481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:24:18.966 [2024-06-10 11:35:02.427539] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:24:18.966 [2024-06-10 11:35:02.427566] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:18.966 [2024-06-10 11:35:02.435501] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:24:18.966 
[2024-06-10 11:35:02.435522] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:24:18.966 [2024-06-10 11:35:02.443526] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:24:18.966 [2024-06-10 11:35:02.443552] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:24:18.966 [2024-06-10 11:35:02.451546] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:24:18.966 [2024-06-10 11:35:02.451565] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:24:18.966 [2024-06-10 11:35:02.451589] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:18.966 [2024-06-10 11:35:02.797712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:24:18.966 [2024-06-10 11:35:02.797748] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:18.966 [2024-06-10 11:35:02.797760] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17275b0 00:24:18.966 [2024-06-10 11:35:02.797785] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:18.966 [2024-06-10 11:35:02.797993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:18.966 [2024-06-10 11:35:02.798006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:24:18.966 11:35:02 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:18.967 11:35:02 json_config -- common/autotest_common.sh@863 -- # return 0 00:24:18.967 11:35:02 json_config -- json_config/common.sh@26 -- # echo '' 00:24:18.967 00:24:18.967 11:35:02 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:24:18.967 11:35:02 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 
00:24:18.967 INFO: Checking if target configuration is the same... 00:24:18.967 11:35:02 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:18.967 11:35:02 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:24:18.967 11:35:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:24:19.225 + '[' 2 -ne 2 ']' 00:24:19.225 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:24:19.225 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:24:19.225 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:19.225 +++ basename /dev/fd/62 00:24:19.225 ++ mktemp /tmp/62.XXX 00:24:19.225 + tmp_file_1=/tmp/62.YEt 00:24:19.225 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:19.225 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:24:19.225 + tmp_file_2=/tmp/spdk_tgt_config.json.CPk 00:24:19.225 + ret=0 00:24:19.225 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:24:19.483 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:24:19.483 + diff -u /tmp/62.YEt /tmp/spdk_tgt_config.json.CPk 00:24:19.483 + echo 'INFO: JSON config files are the same' 00:24:19.483 INFO: JSON config files are the same 00:24:19.483 + rm /tmp/62.YEt /tmp/spdk_tgt_config.json.CPk 00:24:19.483 + exit 0 00:24:19.483 11:35:03 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:24:19.483 11:35:03 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:24:19.483 INFO: changing configuration and checking if this can be detected... 
00:24:19.483 11:35:03 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:24:19.483 11:35:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:24:19.741 11:35:03 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:19.741 11:35:03 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:24:19.741 11:35:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:24:19.741 + '[' 2 -ne 2 ']' 00:24:19.741 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:24:19.741 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:24:19.741 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:19.741 +++ basename /dev/fd/62 00:24:19.741 ++ mktemp /tmp/62.XXX 00:24:19.741 + tmp_file_1=/tmp/62.Z5d 00:24:19.741 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:19.741 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:24:19.741 + tmp_file_2=/tmp/spdk_tgt_config.json.ldj 00:24:19.741 + ret=0 00:24:19.741 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:24:19.999 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:24:19.999 + diff -u /tmp/62.Z5d /tmp/spdk_tgt_config.json.ldj 00:24:19.999 + ret=1 00:24:19.999 + echo '=== Start of file: /tmp/62.Z5d ===' 00:24:19.999 + cat /tmp/62.Z5d 00:24:19.999 + echo '=== End of file: /tmp/62.Z5d ===' 00:24:19.999 + echo '' 00:24:19.999 + echo '=== Start of file: /tmp/spdk_tgt_config.json.ldj ===' 00:24:19.999 + cat /tmp/spdk_tgt_config.json.ldj 00:24:19.999 + echo '=== End of file: /tmp/spdk_tgt_config.json.ldj ===' 00:24:19.999 + echo '' 00:24:19.999 + rm /tmp/62.Z5d /tmp/spdk_tgt_config.json.ldj 00:24:19.999 + exit 1 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:24:19.999 INFO: configuration change detected. 
00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:24:19.999 11:35:03 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:19.999 11:35:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@317 -- # [[ -n 98576 ]] 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:24:19.999 11:35:03 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:19.999 11:35:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:24:19.999 11:35:03 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:24:19.999 11:35:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:24:20.257 11:35:03 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:24:20.257 11:35:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:24:20.257 11:35:04 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:24:20.257 11:35:04 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:24:20.515 11:35:04 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:24:20.515 11:35:04 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:24:20.773 11:35:04 json_config -- json_config/json_config.sh@193 -- # uname -s 00:24:20.773 11:35:04 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:24:20.773 11:35:04 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:24:20.773 11:35:04 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:24:20.773 11:35:04 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:20.773 11:35:04 json_config -- json_config/json_config.sh@323 -- # killprocess 98576 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@949 -- # '[' -z 98576 ']' 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@953 -- # kill -0 98576 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@954 -- # uname 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 98576 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@967 -- # echo 'killing process with pid 98576' 00:24:20.773 killing process with pid 98576 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@968 -- # kill 98576 00:24:20.773 11:35:04 json_config -- common/autotest_common.sh@973 
-- # wait 98576 00:24:22.678 11:35:06 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:24:22.678 11:35:06 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:24:22.678 11:35:06 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:22.678 11:35:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:22.678 11:35:06 json_config -- json_config/json_config.sh@328 -- # return 0 00:24:22.678 11:35:06 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:24:22.678 INFO: Success 00:24:22.678 00:24:22.678 real 0m18.038s 00:24:22.678 user 0m21.711s 00:24:22.678 sys 0m3.536s 00:24:22.678 11:35:06 json_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:22.678 11:35:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:24:22.678 ************************************ 00:24:22.678 END TEST json_config 00:24:22.678 ************************************ 00:24:22.678 11:35:06 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:24:22.678 11:35:06 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:22.678 11:35:06 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:22.678 11:35:06 -- common/autotest_common.sh@10 -- # set +x 00:24:22.678 ************************************ 00:24:22.678 START TEST json_config_extra_key 00:24:22.678 ************************************ 00:24:22.678 11:35:06 json_config_extra_key -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:24:22.679 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:22.679 
11:35:06 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:22.679 11:35:06 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:804da62e-425e-e711-906e-0017a4403562 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=804da62e-425e-e711-906e-0017a4403562 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:22.939 11:35:06 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:22.939 11:35:06 json_config_extra_key -- 
scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:22.939 11:35:06 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:22.939 11:35:06 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:22.939 11:35:06 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:22.939 11:35:06 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:22.939 11:35:06 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:24:22.939 11:35:06 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:22.939 11:35:06 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:24:22.939 INFO: launching applications... 00:24:22.939 11:35:06 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=99633 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:24:22.939 Waiting for target to run... 
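[Editor's sketch] The `declare -A` pairs traced above (`app_pid`, `app_socket`, `app_params`) keep per-app settings in bash associative arrays keyed by app name. A minimal standalone sketch of that pattern — the array values are copied from the trace, the lookup code itself is illustrative:

```shell
#!/usr/bin/env bash
# Per-app settings in bash associative arrays keyed by app name,
# mirroring app_pid/app_socket/app_params in json_config/common.sh.
declare -A app_pid=([target]='')
declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
declare -A app_params=([target]='-m 0x1 -s 1024')

app=target
echo "rpc socket: ${app_socket[$app]}"
echo "app params: ${app_params[$app]}"
```

Keying everything by the app name lets the same helpers drive several apps (e.g. a `target` and an `initiator`) with one code path.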
00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 99633 /var/tmp/spdk_tgt.sock 00:24:22.939 11:35:06 json_config_extra_key -- common/autotest_common.sh@830 -- # '[' -z 99633 ']' 00:24:22.939 11:35:06 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:24:22.939 11:35:06 json_config_extra_key -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:24:22.939 11:35:06 json_config_extra_key -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:22.939 11:35:06 json_config_extra_key -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:24:22.939 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:24:22.939 11:35:06 json_config_extra_key -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:22.939 11:35:06 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:24:22.939 [2024-06-10 11:35:06.709296] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
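[Editor's sketch] `waitforlisten` above polls, with a bounded retry count, until the freshly started target is reachable on its UNIX domain socket. A rough standalone sketch of that polling shape — a plain temp file stands in for the real socket (`/var/tmp/spdk_tgt.sock`), and the retry count mirrors the traced `max_retries=100`; the real helper additionally issues an RPC ping:

```shell
#!/usr/bin/env bash
# Bounded polling until the target's endpoint appears, in the spirit of
# waitforlisten. A temp file stands in for the UNIX socket; real code
# would use a socket test (-S) plus an RPC round-trip.
endpoint=$(mktemp -u)              # illustrative path, not the real socket
( sleep 0.3; : > "$endpoint" ) &   # stand-in for spdk_tgt creating it
max_retries=100
found=0
for ((i = 0; i < max_retries; i++)); do
    if [ -e "$endpoint" ]; then
        found=1
        break
    fi
    sleep 0.1
done
wait
rm -f "$endpoint"
echo "found=$found"
```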
00:24:22.939 [2024-06-10 11:35:06.709351] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid99633 ] 00:24:23.198 [2024-06-10 11:35:07.024224] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:23.198 [2024-06-10 11:35:07.098136] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:23.767 11:35:07 json_config_extra_key -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:23.767 11:35:07 json_config_extra_key -- common/autotest_common.sh@863 -- # return 0 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:24:23.767 00:24:23.767 11:35:07 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:24:23.767 INFO: shutting down applications... 00:24:23.767 11:35:07 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 99633 ]] 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 99633 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 99633 00:24:23.767 11:35:07 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 
00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 99633 00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@43 -- # break 00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:24:24.336 11:35:08 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:24:24.336 SPDK target shutdown done 00:24:24.336 11:35:08 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:24:24.336 Success 00:24:24.336 00:24:24.336 real 0m1.465s 00:24:24.336 user 0m1.054s 00:24:24.336 sys 0m0.442s 00:24:24.336 11:35:08 json_config_extra_key -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:24.336 11:35:08 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:24:24.336 ************************************ 00:24:24.336 END TEST json_config_extra_key 00:24:24.336 ************************************ 00:24:24.336 11:35:08 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:24:24.336 11:35:08 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:24.336 11:35:08 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:24.336 11:35:08 -- common/autotest_common.sh@10 -- # set +x 00:24:24.336 ************************************ 00:24:24.336 START TEST alias_rpc 00:24:24.336 ************************************ 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:24:24.337 * Looking for test storage... 
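[Editor's sketch] The shutdown traced above sends the target a signal and then re-checks the PID with `kill -0` (signal 0 is an existence check, nothing is delivered) up to 30 times with 0.5 s sleeps. A self-contained sketch of that loop, with a background `sleep` standing in for `spdk_tgt` — note it uses plain `kill` (SIGTERM) rather than the traced `kill -SIGINT`, because asynchronous commands in a non-interactive shell inherit SIGINT ignored and `sleep`, unlike `spdk_tgt`, never reinstalls a handler:

```shell
#!/usr/bin/env bash
# Signal the process, then poll with kill -0 until it is gone or the
# 30 tries are used up, as in json_config_test_shutdown_app.
sleep 30 &                      # stand-in for the spdk_tgt process
pid=$!
kill "$pid"                     # real loop sends SIGINT; see note above
stopped=0
for ((i = 0; i < 30; i++)); do
    if ! kill -0 "$pid" 2>/dev/null; then
        stopped=1               # PID no longer exists: shutdown done
        break
    fi
    sleep 0.5
done
wait "$pid" 2>/dev/null
echo "stopped=$stopped"
```

This works in bash because the shell reaps the terminated child promptly, so `kill -0` stops succeeding instead of seeing a lingering zombie.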
00:24:24.337 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:24:24.337 11:35:08 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:24:24.337 11:35:08 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=99864 00:24:24.337 11:35:08 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 99864 00:24:24.337 11:35:08 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@830 -- # '[' -z 99864 ']' 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:24.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:24.337 11:35:08 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:24:24.337 [2024-06-10 11:35:08.239696] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:24:24.337 [2024-06-10 11:35:08.239751] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid99864 ] 00:24:24.596 [2024-06-10 11:35:08.325796] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:24.596 [2024-06-10 11:35:08.410501] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:25.165 11:35:09 alias_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:25.165 11:35:09 alias_rpc -- common/autotest_common.sh@863 -- # return 0 00:24:25.165 11:35:09 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:24:25.424 11:35:09 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 99864 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@949 -- # '[' -z 99864 ']' 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@953 -- # kill -0 99864 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@954 -- # uname 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 99864 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 99864' 00:24:25.424 killing process with pid 99864 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@968 -- # kill 99864 00:24:25.424 11:35:09 alias_rpc -- common/autotest_common.sh@973 -- # wait 99864 00:24:25.684 00:24:25.684 real 0m1.529s 00:24:25.684 user 0m1.594s 00:24:25.684 sys 0m0.456s 00:24:25.684 11:35:09 alias_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 
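[Editor's sketch] `killprocess` above double-checks what it is about to kill: a `uname` gate guards the Linux-only `ps --no-headers -o comm=` lookup, the resolved command name is compared against `sudo`, and only then does it `kill` and `wait`. A hedged standalone sketch of that sequence, with a background `sleep` standing in for the target:

```shell
#!/usr/bin/env bash
# Verify what a PID actually is before killing it, as killprocess does:
# resolve the command name with ps -o comm=, refuse sudo, then kill+wait.
sleep 30 &                         # stand-in for the target process
pid=$!
process_name=unknown
if [ "$(uname)" = Linux ]; then
    process_name=$(ps --no-headers -o comm= "$pid")
fi
if [ "$process_name" != sudo ]; then
    echo "killing process with pid $pid"
    kill "$pid"
fi
wait "$pid" 2>/dev/null
echo "name=$process_name"
```

The `sudo` comparison keeps the helper from signalling a privilege wrapper instead of the reactor process it launched.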
00:24:25.684 11:35:09 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:24:25.684 ************************************ 00:24:25.684 END TEST alias_rpc 00:24:25.684 ************************************ 00:24:25.943 11:35:09 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:24:25.943 11:35:09 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:24:25.943 11:35:09 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:25.943 11:35:09 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:25.943 11:35:09 -- common/autotest_common.sh@10 -- # set +x 00:24:25.943 ************************************ 00:24:25.943 START TEST spdkcli_tcp 00:24:25.943 ************************************ 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:24:25.943 * Looking for test storage... 00:24:25.943 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@723 -- # xtrace_disable 00:24:25.943 11:35:09 spdkcli_tcp -- 
common/autotest_common.sh@10 -- # set +x 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=100098 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 100098 00:24:25.943 11:35:09 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@830 -- # '[' -z 100098 ']' 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:25.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:25.943 11:35:09 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:24:25.943 [2024-06-10 11:35:09.860374] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:24:25.943 [2024-06-10 11:35:09.860431] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid100098 ] 00:24:26.202 [2024-06-10 11:35:09.948737] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:26.202 [2024-06-10 11:35:10.034725] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:24:26.202 [2024-06-10 11:35:10.034728] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:26.770 11:35:10 spdkcli_tcp -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:26.770 11:35:10 spdkcli_tcp -- common/autotest_common.sh@863 -- # return 0 00:24:26.770 11:35:10 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:24:26.770 11:35:10 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=100270 00:24:26.770 11:35:10 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:24:27.030 [ 00:24:27.030 "bdev_malloc_delete", 00:24:27.030 "bdev_malloc_create", 00:24:27.030 "bdev_null_resize", 00:24:27.030 "bdev_null_delete", 00:24:27.030 "bdev_null_create", 00:24:27.030 "bdev_nvme_cuse_unregister", 00:24:27.030 "bdev_nvme_cuse_register", 00:24:27.030 "bdev_opal_new_user", 00:24:27.030 "bdev_opal_set_lock_state", 00:24:27.030 "bdev_opal_delete", 00:24:27.030 "bdev_opal_get_info", 00:24:27.030 "bdev_opal_create", 00:24:27.030 "bdev_nvme_opal_revert", 00:24:27.030 "bdev_nvme_opal_init", 00:24:27.030 "bdev_nvme_send_cmd", 00:24:27.030 "bdev_nvme_get_path_iostat", 00:24:27.030 "bdev_nvme_get_mdns_discovery_info", 00:24:27.030 "bdev_nvme_stop_mdns_discovery", 00:24:27.030 "bdev_nvme_start_mdns_discovery", 00:24:27.030 "bdev_nvme_set_multipath_policy", 00:24:27.030 "bdev_nvme_set_preferred_path", 00:24:27.030 "bdev_nvme_get_io_paths", 00:24:27.030 
"bdev_nvme_remove_error_injection", 00:24:27.030 "bdev_nvme_add_error_injection", 00:24:27.030 "bdev_nvme_get_discovery_info", 00:24:27.030 "bdev_nvme_stop_discovery", 00:24:27.030 "bdev_nvme_start_discovery", 00:24:27.030 "bdev_nvme_get_controller_health_info", 00:24:27.030 "bdev_nvme_disable_controller", 00:24:27.030 "bdev_nvme_enable_controller", 00:24:27.030 "bdev_nvme_reset_controller", 00:24:27.030 "bdev_nvme_get_transport_statistics", 00:24:27.030 "bdev_nvme_apply_firmware", 00:24:27.030 "bdev_nvme_detach_controller", 00:24:27.030 "bdev_nvme_get_controllers", 00:24:27.030 "bdev_nvme_attach_controller", 00:24:27.030 "bdev_nvme_set_hotplug", 00:24:27.031 "bdev_nvme_set_options", 00:24:27.031 "bdev_passthru_delete", 00:24:27.031 "bdev_passthru_create", 00:24:27.031 "bdev_lvol_set_parent_bdev", 00:24:27.031 "bdev_lvol_set_parent", 00:24:27.031 "bdev_lvol_check_shallow_copy", 00:24:27.031 "bdev_lvol_start_shallow_copy", 00:24:27.031 "bdev_lvol_grow_lvstore", 00:24:27.031 "bdev_lvol_get_lvols", 00:24:27.031 "bdev_lvol_get_lvstores", 00:24:27.031 "bdev_lvol_delete", 00:24:27.031 "bdev_lvol_set_read_only", 00:24:27.031 "bdev_lvol_resize", 00:24:27.031 "bdev_lvol_decouple_parent", 00:24:27.031 "bdev_lvol_inflate", 00:24:27.031 "bdev_lvol_rename", 00:24:27.031 "bdev_lvol_clone_bdev", 00:24:27.031 "bdev_lvol_clone", 00:24:27.031 "bdev_lvol_snapshot", 00:24:27.031 "bdev_lvol_create", 00:24:27.031 "bdev_lvol_delete_lvstore", 00:24:27.031 "bdev_lvol_rename_lvstore", 00:24:27.031 "bdev_lvol_create_lvstore", 00:24:27.031 "bdev_raid_set_options", 00:24:27.031 "bdev_raid_remove_base_bdev", 00:24:27.031 "bdev_raid_add_base_bdev", 00:24:27.031 "bdev_raid_delete", 00:24:27.031 "bdev_raid_create", 00:24:27.031 "bdev_raid_get_bdevs", 00:24:27.031 "bdev_error_inject_error", 00:24:27.031 "bdev_error_delete", 00:24:27.031 "bdev_error_create", 00:24:27.031 "bdev_split_delete", 00:24:27.031 "bdev_split_create", 00:24:27.031 "bdev_delay_delete", 00:24:27.031 "bdev_delay_create", 
00:24:27.031 "bdev_delay_update_latency", 00:24:27.031 "bdev_zone_block_delete", 00:24:27.031 "bdev_zone_block_create", 00:24:27.031 "blobfs_create", 00:24:27.031 "blobfs_detect", 00:24:27.031 "blobfs_set_cache_size", 00:24:27.031 "bdev_crypto_delete", 00:24:27.031 "bdev_crypto_create", 00:24:27.031 "bdev_compress_delete", 00:24:27.031 "bdev_compress_create", 00:24:27.031 "bdev_compress_get_orphans", 00:24:27.031 "bdev_aio_delete", 00:24:27.031 "bdev_aio_rescan", 00:24:27.031 "bdev_aio_create", 00:24:27.031 "bdev_ftl_set_property", 00:24:27.031 "bdev_ftl_get_properties", 00:24:27.031 "bdev_ftl_get_stats", 00:24:27.031 "bdev_ftl_unmap", 00:24:27.031 "bdev_ftl_unload", 00:24:27.031 "bdev_ftl_delete", 00:24:27.031 "bdev_ftl_load", 00:24:27.031 "bdev_ftl_create", 00:24:27.031 "bdev_virtio_attach_controller", 00:24:27.031 "bdev_virtio_scsi_get_devices", 00:24:27.031 "bdev_virtio_detach_controller", 00:24:27.031 "bdev_virtio_blk_set_hotplug", 00:24:27.031 "bdev_iscsi_delete", 00:24:27.031 "bdev_iscsi_create", 00:24:27.031 "bdev_iscsi_set_options", 00:24:27.031 "accel_error_inject_error", 00:24:27.031 "ioat_scan_accel_module", 00:24:27.031 "dsa_scan_accel_module", 00:24:27.031 "iaa_scan_accel_module", 00:24:27.031 "dpdk_cryptodev_get_driver", 00:24:27.031 "dpdk_cryptodev_set_driver", 00:24:27.031 "dpdk_cryptodev_scan_accel_module", 00:24:27.031 "compressdev_scan_accel_module", 00:24:27.031 "keyring_file_remove_key", 00:24:27.031 "keyring_file_add_key", 00:24:27.031 "keyring_linux_set_options", 00:24:27.031 "iscsi_get_histogram", 00:24:27.031 "iscsi_enable_histogram", 00:24:27.031 "iscsi_set_options", 00:24:27.031 "iscsi_get_auth_groups", 00:24:27.031 "iscsi_auth_group_remove_secret", 00:24:27.031 "iscsi_auth_group_add_secret", 00:24:27.031 "iscsi_delete_auth_group", 00:24:27.031 "iscsi_create_auth_group", 00:24:27.031 "iscsi_set_discovery_auth", 00:24:27.031 "iscsi_get_options", 00:24:27.031 "iscsi_target_node_request_logout", 00:24:27.031 
"iscsi_target_node_set_redirect", 00:24:27.031 "iscsi_target_node_set_auth", 00:24:27.031 "iscsi_target_node_add_lun", 00:24:27.031 "iscsi_get_stats", 00:24:27.031 "iscsi_get_connections", 00:24:27.031 "iscsi_portal_group_set_auth", 00:24:27.031 "iscsi_start_portal_group", 00:24:27.031 "iscsi_delete_portal_group", 00:24:27.031 "iscsi_create_portal_group", 00:24:27.031 "iscsi_get_portal_groups", 00:24:27.031 "iscsi_delete_target_node", 00:24:27.031 "iscsi_target_node_remove_pg_ig_maps", 00:24:27.031 "iscsi_target_node_add_pg_ig_maps", 00:24:27.031 "iscsi_create_target_node", 00:24:27.031 "iscsi_get_target_nodes", 00:24:27.031 "iscsi_delete_initiator_group", 00:24:27.031 "iscsi_initiator_group_remove_initiators", 00:24:27.031 "iscsi_initiator_group_add_initiators", 00:24:27.031 "iscsi_create_initiator_group", 00:24:27.031 "iscsi_get_initiator_groups", 00:24:27.031 "nvmf_set_crdt", 00:24:27.031 "nvmf_set_config", 00:24:27.031 "nvmf_set_max_subsystems", 00:24:27.031 "nvmf_stop_mdns_prr", 00:24:27.031 "nvmf_publish_mdns_prr", 00:24:27.031 "nvmf_subsystem_get_listeners", 00:24:27.031 "nvmf_subsystem_get_qpairs", 00:24:27.031 "nvmf_subsystem_get_controllers", 00:24:27.031 "nvmf_get_stats", 00:24:27.031 "nvmf_get_transports", 00:24:27.031 "nvmf_create_transport", 00:24:27.031 "nvmf_get_targets", 00:24:27.031 "nvmf_delete_target", 00:24:27.031 "nvmf_create_target", 00:24:27.031 "nvmf_subsystem_allow_any_host", 00:24:27.031 "nvmf_subsystem_remove_host", 00:24:27.031 "nvmf_subsystem_add_host", 00:24:27.031 "nvmf_ns_remove_host", 00:24:27.031 "nvmf_ns_add_host", 00:24:27.031 "nvmf_subsystem_remove_ns", 00:24:27.031 "nvmf_subsystem_add_ns", 00:24:27.031 "nvmf_subsystem_listener_set_ana_state", 00:24:27.031 "nvmf_discovery_get_referrals", 00:24:27.031 "nvmf_discovery_remove_referral", 00:24:27.031 "nvmf_discovery_add_referral", 00:24:27.031 "nvmf_subsystem_remove_listener", 00:24:27.031 "nvmf_subsystem_add_listener", 00:24:27.031 "nvmf_delete_subsystem", 00:24:27.031 
"nvmf_create_subsystem", 00:24:27.031 "nvmf_get_subsystems", 00:24:27.031 "env_dpdk_get_mem_stats", 00:24:27.031 "nbd_get_disks", 00:24:27.031 "nbd_stop_disk", 00:24:27.031 "nbd_start_disk", 00:24:27.031 "ublk_recover_disk", 00:24:27.031 "ublk_get_disks", 00:24:27.031 "ublk_stop_disk", 00:24:27.031 "ublk_start_disk", 00:24:27.031 "ublk_destroy_target", 00:24:27.031 "ublk_create_target", 00:24:27.031 "virtio_blk_create_transport", 00:24:27.031 "virtio_blk_get_transports", 00:24:27.031 "vhost_controller_set_coalescing", 00:24:27.031 "vhost_get_controllers", 00:24:27.031 "vhost_delete_controller", 00:24:27.031 "vhost_create_blk_controller", 00:24:27.031 "vhost_scsi_controller_remove_target", 00:24:27.031 "vhost_scsi_controller_add_target", 00:24:27.031 "vhost_start_scsi_controller", 00:24:27.031 "vhost_create_scsi_controller", 00:24:27.031 "thread_set_cpumask", 00:24:27.031 "framework_get_scheduler", 00:24:27.031 "framework_set_scheduler", 00:24:27.031 "framework_get_reactors", 00:24:27.031 "thread_get_io_channels", 00:24:27.031 "thread_get_pollers", 00:24:27.031 "thread_get_stats", 00:24:27.031 "framework_monitor_context_switch", 00:24:27.031 "spdk_kill_instance", 00:24:27.031 "log_enable_timestamps", 00:24:27.031 "log_get_flags", 00:24:27.031 "log_clear_flag", 00:24:27.031 "log_set_flag", 00:24:27.031 "log_get_level", 00:24:27.031 "log_set_level", 00:24:27.031 "log_get_print_level", 00:24:27.031 "log_set_print_level", 00:24:27.031 "framework_enable_cpumask_locks", 00:24:27.031 "framework_disable_cpumask_locks", 00:24:27.031 "framework_wait_init", 00:24:27.031 "framework_start_init", 00:24:27.031 "scsi_get_devices", 00:24:27.031 "bdev_get_histogram", 00:24:27.031 "bdev_enable_histogram", 00:24:27.031 "bdev_set_qos_limit", 00:24:27.031 "bdev_set_qd_sampling_period", 00:24:27.031 "bdev_get_bdevs", 00:24:27.031 "bdev_reset_iostat", 00:24:27.031 "bdev_get_iostat", 00:24:27.031 "bdev_examine", 00:24:27.031 "bdev_wait_for_examine", 00:24:27.031 "bdev_set_options", 
00:24:27.031 "notify_get_notifications", 00:24:27.031 "notify_get_types", 00:24:27.031 "accel_get_stats", 00:24:27.031 "accel_set_options", 00:24:27.031 "accel_set_driver", 00:24:27.031 "accel_crypto_key_destroy", 00:24:27.031 "accel_crypto_keys_get", 00:24:27.031 "accel_crypto_key_create", 00:24:27.031 "accel_assign_opc", 00:24:27.031 "accel_get_module_info", 00:24:27.031 "accel_get_opc_assignments", 00:24:27.032 "vmd_rescan", 00:24:27.032 "vmd_remove_device", 00:24:27.032 "vmd_enable", 00:24:27.032 "sock_get_default_impl", 00:24:27.032 "sock_set_default_impl", 00:24:27.032 "sock_impl_set_options", 00:24:27.032 "sock_impl_get_options", 00:24:27.032 "iobuf_get_stats", 00:24:27.032 "iobuf_set_options", 00:24:27.032 "framework_get_pci_devices", 00:24:27.032 "framework_get_config", 00:24:27.032 "framework_get_subsystems", 00:24:27.032 "trace_get_info", 00:24:27.032 "trace_get_tpoint_group_mask", 00:24:27.032 "trace_disable_tpoint_group", 00:24:27.032 "trace_enable_tpoint_group", 00:24:27.032 "trace_clear_tpoint_mask", 00:24:27.032 "trace_set_tpoint_mask", 00:24:27.032 "keyring_get_keys", 00:24:27.032 "spdk_get_version", 00:24:27.032 "rpc_get_methods" 00:24:27.032 ] 00:24:27.032 11:35:10 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@729 -- # xtrace_disable 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:24:27.032 11:35:10 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:24:27.032 11:35:10 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 100098 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@949 -- # '[' -z 100098 ']' 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@953 -- # kill -0 100098 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@954 -- # uname 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:27.032 11:35:10 spdkcli_tcp -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 100098 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@967 -- # echo 'killing process with pid 100098' 00:24:27.032 killing process with pid 100098 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@968 -- # kill 100098 00:24:27.032 11:35:10 spdkcli_tcp -- common/autotest_common.sh@973 -- # wait 100098 00:24:27.600 00:24:27.600 real 0m1.599s 00:24:27.600 user 0m2.821s 00:24:27.600 sys 0m0.523s 00:24:27.600 11:35:11 spdkcli_tcp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:27.600 11:35:11 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:24:27.600 ************************************ 00:24:27.600 END TEST spdkcli_tcp 00:24:27.600 ************************************ 00:24:27.600 11:35:11 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:24:27.600 11:35:11 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:27.600 11:35:11 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:27.600 11:35:11 -- common/autotest_common.sh@10 -- # set +x 00:24:27.600 ************************************ 00:24:27.600 START TEST dpdk_mem_utility 00:24:27.600 ************************************ 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:24:27.600 * Looking for test storage... 
00:24:27.600 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:24:27.600 11:35:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:24:27.600 11:35:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=100480 00:24:27.600 11:35:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 100480 00:24:27.600 11:35:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@830 -- # '[' -z 100480 ']' 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:27.600 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:27.600 11:35:11 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:24:27.600 [2024-06-10 11:35:11.532857] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:24:27.600 [2024-06-10 11:35:11.532940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid100480 ] 00:24:27.859 [2024-06-10 11:35:11.621247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:27.859 [2024-06-10 11:35:11.708284] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:28.428 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:28.428 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@863 -- # return 0 00:24:28.428 11:35:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:24:28.428 11:35:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:24:28.428 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@560 -- # xtrace_disable 00:24:28.428 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:24:28.428 { 00:24:28.428 "filename": "/tmp/spdk_mem_dump.txt" 00:24:28.428 } 00:24:28.428 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:24:28.428 11:35:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:24:28.691 DPDK memory size 816.000000 MiB in 2 heap(s) 00:24:28.691 2 heaps totaling size 816.000000 MiB 00:24:28.691 size: 814.000000 MiB heap id: 0 00:24:28.691 size: 2.000000 MiB heap id: 1 00:24:28.691 end heaps---------- 00:24:28.691 8 mempools totaling size 598.116089 MiB 00:24:28.691 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:24:28.691 size: 158.602051 MiB name: PDU_data_out_Pool 00:24:28.691 size: 84.521057 MiB name: bdev_io_100480 00:24:28.691 size: 51.011292 MiB name: evtpool_100480 00:24:28.691 size: 50.003479 MiB name: msgpool_100480 
00:24:28.691 size: 21.763794 MiB name: PDU_Pool 00:24:28.691 size: 19.513306 MiB name: SCSI_TASK_Pool 00:24:28.691 size: 0.026123 MiB name: Session_Pool 00:24:28.691 end mempools------- 00:24:28.691 201 memzones totaling size 4.176453 MiB 00:24:28.691 size: 1.000366 MiB name: RG_ring_0_100480 00:24:28.691 size: 1.000366 MiB name: RG_ring_1_100480 00:24:28.691 size: 1.000366 MiB name: RG_ring_4_100480 00:24:28.691 size: 1.000366 MiB name: RG_ring_5_100480 00:24:28.691 size: 0.125366 MiB name: RG_ring_2_100480 00:24:28.691 size: 0.015991 MiB name: RG_ring_3_100480 00:24:28.691 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:24:28.691 size: 0.000305 MiB name: 
0000:3f:02.0_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:24:28.691 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.0_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.1_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.2_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.3_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.4_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.5_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.6_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:01.7_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.0_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.1_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.2_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.3_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.4_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.5_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.6_qat 00:24:28.691 size: 0.000305 MiB name: 0000:da:02.7_qat 00:24:28.691 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_0 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_1 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_0 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_2 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_3 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_1 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_4 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_5 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_2 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_6 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_7 
00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_3 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_8 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_9 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_4 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_10 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_11 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_5 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_12 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_13 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_6 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_14 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_15 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_7 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_16 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_17 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_8 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_18 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_19 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_9 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_20 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_21 00:24:28.691 size: 0.000122 MiB name: rte_compressdev_data_10 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_22 00:24:28.691 size: 0.000122 MiB name: rte_cryptodev_data_23 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_11 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_24 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_25 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_12 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_26 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_27 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_13 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_28 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_29 
00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_14 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_30 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_31 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_15 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_32 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_33 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_16 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_34 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_35 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_17 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_36 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_37 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_18 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_38 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_39 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_19 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_40 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_41 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_20 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_42 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_43 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_21 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_44 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_45 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_22 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_46 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_47 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_23 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_48 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_49 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_24 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_50 00:24:28.692 size: 0.000122 MiB name: 
rte_cryptodev_data_51 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_25 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_52 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_53 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_26 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_54 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_55 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_27 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_56 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_57 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_28 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_58 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_59 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_29 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_60 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_61 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_30 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_62 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_63 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_31 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_64 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_65 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_32 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_66 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_67 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_33 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_68 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_69 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_34 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_70 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_71 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_35 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_72 00:24:28.692 size: 0.000122 MiB 
name: rte_cryptodev_data_73 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_36 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_74 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_75 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_37 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_76 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_77 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_38 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_78 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_79 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_39 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_80 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_81 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_40 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_82 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_83 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_41 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_84 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_85 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_42 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_86 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_87 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_43 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_88 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_89 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_44 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_90 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_91 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_45 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_92 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_93 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_46 00:24:28.692 size: 0.000122 MiB name: rte_cryptodev_data_94 00:24:28.692 size: 0.000122 
MiB name: rte_cryptodev_data_95 00:24:28.692 size: 0.000122 MiB name: rte_compressdev_data_47 00:24:28.692 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:24:28.692 end memzones------- 00:24:28.692 11:35:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:24:28.692 heap id: 0 total size: 814.000000 MiB number of busy elements: 557 number of free elements: 14 00:24:28.692 list of free elements. size: 11.807861 MiB 00:24:28.692 element at address: 0x200000400000 with size: 1.999512 MiB 00:24:28.692 element at address: 0x200018e00000 with size: 0.999878 MiB 00:24:28.692 element at address: 0x200019000000 with size: 0.999878 MiB 00:24:28.692 element at address: 0x200003e00000 with size: 0.996460 MiB 00:24:28.692 element at address: 0x200031c00000 with size: 0.994446 MiB 00:24:28.692 element at address: 0x200013800000 with size: 0.978882 MiB 00:24:28.692 element at address: 0x200007000000 with size: 0.959839 MiB 00:24:28.692 element at address: 0x200019200000 with size: 0.937256 MiB 00:24:28.692 element at address: 0x20001aa00000 with size: 0.580139 MiB 00:24:28.692 element at address: 0x200003a00000 with size: 0.498535 MiB 00:24:28.692 element at address: 0x20000b200000 with size: 0.491272 MiB 00:24:28.692 element at address: 0x200000800000 with size: 0.486511 MiB 00:24:28.692 element at address: 0x200019400000 with size: 0.485840 MiB 00:24:28.692 element at address: 0x200027e00000 with size: 0.399414 MiB 00:24:28.692 list of standard malloc elements. 
size: 199.883850 MiB 00:24:28.692 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:24:28.692 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:24:28.692 element at address: 0x200018efff80 with size: 1.000122 MiB 00:24:28.692 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:24:28.692 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:24:28.692 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:24:28.692 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:24:28.692 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:24:28.692 element at address: 0x200000330b40 with size: 0.004395 MiB 00:24:28.692 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000337640 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000033e140 with size: 0.004395 MiB 00:24:28.692 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000344c40 with size: 0.004395 MiB 00:24:28.692 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000034b740 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000352240 with size: 0.004395 MiB 00:24:28.692 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000358d40 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000035f840 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000366880 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000036a340 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000036de00 with size: 0.004395 MiB 00:24:28.692 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000375380 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000378e40 with size: 0.004395 MiB 00:24:28.692 element at address: 0x20000037c900 with size: 0.004395 MiB 00:24:28.692 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:24:28.692 element at address: 0x200000383e80 with size: 0.004395 MiB 00:24:28.693 element at address: 0x200000387940 with size: 0.004395 MiB 00:24:28.693 element at address: 0x20000038b400 with size: 0.004395 MiB 00:24:28.693 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:24:28.693 element at address: 0x200000392980 with size: 0.004395 MiB 00:24:28.693 element at address: 0x200000396440 with size: 0.004395 MiB 00:24:28.693 element at address: 0x200000399f00 with size: 0.004395 MiB 00:24:28.693 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:24:28.693 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:24:28.693 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:24:28.693 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000333040 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000335540 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000339b40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000033c040 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000340640 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000342b40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000347140 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000349640 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000350140 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000354740 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000356c40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:24:28.693 element at address: 0x20000035b240 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000035d740 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000361d40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000364780 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000365800 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000368240 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000370840 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000373280 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000374300 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000376d40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000037a800 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000037b880 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000037f340 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000381d80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000382e00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000385840 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000389300 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000038a380 with size: 0.004028 MiB 00:24:28.693 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000038de40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000390880 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000391900 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000394340 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000397e00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000398e80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000039c940 with size: 0.004028 MiB 00:24:28.693 element at address: 0x20000039f380 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:24:28.693 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:24:28.693 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:24:28.693 element at address: 0x200000204f80 with size: 0.000305 MiB 00:24:28.693 element at address: 0x200000200000 with size: 0.000183 MiB 00:24:28.693 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200180 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200240 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200300 with size: 0.000183 MiB 00:24:28.693 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200480 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200540 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200600 with size: 0.000183 MiB 00:24:28.693 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200780 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200840 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200900 with size: 0.000183 
MiB 00:24:28.693 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200a80 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200b40 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200c00 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200d80 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200e40 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200f00 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201080 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201140 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201200 with size: 0.000183 MiB 00:24:28.693 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201380 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201440 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201500 with size: 0.000183 MiB 00:24:28.693 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201680 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201740 with size: 0.000183 MiB 00:24:28.693 element at address: 0x200000201800 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201980 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201a40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201b00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201c80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201d40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000201f80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202040 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202100 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202280 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202340 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202400 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202580 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202640 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202700 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202880 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202940 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202a00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202b80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202c40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202d00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202e80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000202f40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203000 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203180 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203240 with size: 0.000183 MiB 00:24:28.694 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203480 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203540 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203600 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203780 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203840 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203900 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203a80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203b40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203c00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203d80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203e40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203f00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204080 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204140 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204200 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204380 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204440 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204500 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204680 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204740 with size: 0.000183 MiB 
00:24:28.694 element at address: 0x200000204800 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204980 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204a40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204b00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204c80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204d40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204e00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000204ec0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205180 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205240 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205300 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205480 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205540 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205600 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205780 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205840 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205900 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205a80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205b40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205c00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205e40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205f00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000206080 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000206140 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000206200 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000020a780 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022af80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b040 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b100 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b280 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b340 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b400 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b580 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b640 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b700 with size: 0.000183 MiB 00:24:28.694 element at address: 
0x20000022b900 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022be40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c080 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c140 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c200 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c380 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c440 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000022c500 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000032e700 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000331d40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000338840 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000033f340 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000345e40 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x20000034c940 with size: 0.000183 MiB 00:24:28.694 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:24:28.694 element at address: 0x200000353440 with size: 0.000183 MiB 00:24:28.694 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000359f40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000360a40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000364180 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000364240 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000364400 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000367a80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000367c40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000367d00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036b540 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036b700 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036b980 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036f000 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036f280 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000036f440 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000372c80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000372d40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000372f00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000376580 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000376740 with size: 0.000183 
MiB 00:24:28.695 element at address: 0x200000376800 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037a040 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037a200 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037a480 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037db00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000037df40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000381780 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000381840 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000381a00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000385080 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000385240 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000385300 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000388b40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000388d00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000388f80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000038c600 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000038c880 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000390280 
with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000390340 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000390500 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000393b80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000393d40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000393e00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000397640 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000397800 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x200000397a80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039b100 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039b380 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039b540 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000039f000 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:24:28.695 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:24:28.695 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087c980 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087cbc0 with 
size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:24:28.695 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:24:28.695 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:24:28.696 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:24:28.696 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:24:28.696 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:24:28.696 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e66400 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e664c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d0c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:24:28.696 element at address: 
0x200027e6d380 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:24:28.696 
element at address: 0x200027e6e880 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6fcc0 with size: 0.000183 
MiB 00:24:28.696 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:24:28.696 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:24:28.696 list of memzone associated elements. size: 602.308289 MiB 00:24:28.696 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:24:28.696 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:24:28.696 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:24:28.696 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:24:28.696 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:24:28.696 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_100480_0 00:24:28.696 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:24:28.696 associated memzone info: size: 48.002930 MiB name: MP_evtpool_100480_0 00:24:28.696 element at address: 0x200003fff380 with size: 48.003052 MiB 00:24:28.696 associated memzone info: size: 48.002930 MiB name: MP_msgpool_100480_0 00:24:28.696 element at address: 0x2000195be940 with size: 20.255554 MiB 00:24:28.696 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:24:28.696 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:24:28.696 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:24:28.696 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:24:28.696 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_100480 00:24:28.696 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:24:28.696 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_100480 00:24:28.696 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:24:28.696 associated memzone info: size: 1.007996 MiB name: MP_evtpool_100480 00:24:28.696 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:24:28.696 associated memzone info: size: 1.007996 MiB name: 
MP_PDU_Pool 00:24:28.696 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:24:28.696 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:24:28.696 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:24:28.696 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:24:28.696 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:24:28.696 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:24:28.696 element at address: 0x200003eff180 with size: 1.000488 MiB 00:24:28.696 associated memzone info: size: 1.000366 MiB name: RG_ring_0_100480 00:24:28.696 element at address: 0x200003affc00 with size: 1.000488 MiB 00:24:28.696 associated memzone info: size: 1.000366 MiB name: RG_ring_1_100480 00:24:28.696 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:24:28.696 associated memzone info: size: 1.000366 MiB name: RG_ring_4_100480 00:24:28.696 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:24:28.696 associated memzone info: size: 1.000366 MiB name: RG_ring_5_100480 00:24:28.696 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:24:28.696 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_100480 00:24:28.696 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:24:28.696 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:24:28.696 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:24:28.696 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:24:28.696 element at address: 0x20001947c600 with size: 0.250488 MiB 00:24:28.696 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:24:28.696 element at address: 0x20000020a840 with size: 0.125488 MiB 00:24:28.696 associated memzone info: size: 0.125366 MiB name: RG_ring_2_100480 00:24:28.696 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:24:28.696 associated memzone info: size: 0.031616 MiB 
name: RG_MP_PDU_data_out_Pool 00:24:28.696 element at address: 0x200027e66580 with size: 0.023743 MiB 00:24:28.697 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:24:28.697 element at address: 0x200000206580 with size: 0.016113 MiB 00:24:28.697 associated memzone info: size: 0.015991 MiB name: RG_ring_3_100480 00:24:28.697 element at address: 0x200027e6c6c0 with size: 0.002441 MiB 00:24:28.697 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:24:28.697 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:24:28.697 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:24:28.697 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:24:28.697 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:24:28.697 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:24:28.697 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:24:28.697 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:24:28.697 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:24:28.697 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:24:28.697 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:24:28.697 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 
0000:3d:02.0_qat 00:24:28.697 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:24:28.697 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:24:28.697 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:24:28.697 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:24:28.697 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:24:28.697 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:24:28.697 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:24:28.697 element at address: 0x20000039b700 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:24:28.697 element at address: 0x200000397c40 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:24:28.697 element at address: 0x200000394180 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:24:28.697 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:24:28.697 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:24:28.697 element at address: 0x200000389140 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 
00:24:28.697 element at address: 0x200000385680 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:24:28.697 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:24:28.697 element at address: 0x20000037e100 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:24:28.697 element at address: 0x20000037a640 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:24:28.697 element at address: 0x200000376b80 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:24:28.697 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:24:28.697 element at address: 0x20000036f600 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:24:28.697 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:24:28.697 element at address: 0x200000368080 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:24:28.697 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:24:28.697 element at address: 0x200000360b00 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:24:28.697 element at address: 0x20000035d580 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:24:28.697 element at address: 0x20000035a000 with size: 0.000427 MiB 00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:24:28.697 element at 
address: 0x200000356a80 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat
00:24:28.697 element at address: 0x200000353500 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat
00:24:28.697 element at address: 0x20000034ff80 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat
00:24:28.697 element at address: 0x20000034ca00 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat
00:24:28.697 element at address: 0x200000349480 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat
00:24:28.697 element at address: 0x200000345f00 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat
00:24:28.697 element at address: 0x200000342980 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat
00:24:28.697 element at address: 0x20000033f400 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat
00:24:28.697 element at address: 0x20000033be80 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat
00:24:28.697 element at address: 0x200000338900 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat
00:24:28.697 element at address: 0x200000335380 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat
00:24:28.697 element at address: 0x200000331e00 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat
00:24:28.697 element at address: 0x20000032e880 with size: 0.000427 MiB
00:24:28.697 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat
00:24:28.697 element at address: 0x2000003d6740 with size: 0.000305 MiB
00:24:28.697 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1
00:24:28.697 element at address: 0x20000022b7c0 with size: 0.000305 MiB
00:24:28.697 associated memzone info: size: 0.000183 MiB name: MP_msgpool_100480
00:24:28.697 element at address: 0x200000206380 with size: 0.000305 MiB
00:24:28.697 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_100480
00:24:28.697 element at address: 0x200027e6d180 with size: 0.000305 MiB
00:24:28.697 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:24:28.697 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:24:28.697 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:24:28.697 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:24:28.697 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:24:28.697 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:24:28.697 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:24:28.697 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:24:28.697 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:24:28.697 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:24:28.697 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:24:28.697 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:24:28.697 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:24:28.697 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:24:28.697 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:24:28.697 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:24:28.697 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:24:28.697 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:24:28.697 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:24:28.698 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:24:28.698 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:24:28.698 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:24:28.698 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:24:28.698 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:24:28.698 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:24:28.698 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:24:28.698 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:24:28.698 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:24:28.698 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:24:28.698 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:24:28.698 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:24:28.698 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:24:28.698 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:24:28.698 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:24:28.698 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:24:28.698 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:24:28.698 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:24:28.698 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:24:28.698 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:24:28.698 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:24:28.698 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:24:28.698 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:24:28.698 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:24:28.698 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:24:28.698 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:24:28.698 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:24:28.698 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:24:28.698 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:24:28.698 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:24:28.698 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:24:28.698 element at address: 0x20000039b600 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:24:28.698 element at address: 0x20000039b440 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:24:28.698 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:24:28.698 element at address: 0x200000397b40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:24:28.698 element at address: 0x200000397980 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:24:28.698 element at address: 0x200000397700 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:24:28.698 element at address: 0x200000394080 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:24:28.698 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:24:28.698 element at address: 0x200000393c40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:24:28.698 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:24:28.698 element at address: 0x200000390400 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:24:28.698 element at address: 0x200000390180 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:24:28.698 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:24:28.698 element at address: 0x20000038c940 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:24:28.698 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:24:28.698 element at address: 0x200000389040 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:24:28.698 element at address: 0x200000388e80 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:24:28.698 element at address: 0x200000388c00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:24:28.698 element at address: 0x200000385580 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:24:28.698 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:24:28.698 element at address: 0x200000385140 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:24:28.698 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:24:28.698 element at address: 0x200000381900 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:24:28.698 element at address: 0x200000381680 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:24:28.698 element at address: 0x20000037e000 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:24:28.698 element at address: 0x20000037de40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:24:28.698 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:24:28.698 element at address: 0x20000037a540 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:24:28.698 element at address: 0x20000037a380 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:24:28.698 element at address: 0x20000037a100 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:24:28.698 element at address: 0x200000376a80 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:24:28.698 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:24:28.698 element at address: 0x200000376640 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:24:28.698 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:24:28.698 element at address: 0x200000372e00 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:24:28.698 element at address: 0x200000372b80 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:24:28.698 element at address: 0x20000036f500 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:24:28.698 element at address: 0x20000036f340 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:24:28.698 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:24:28.698 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:24:28.698 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:24:28.698 element at address: 0x20000036b880 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:24:28.699 element at address: 0x20000036b600 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:24:28.699 element at address: 0x200000367f80 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:24:28.699 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:24:28.699 element at address: 0x200000367b40 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:24:28.699 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:24:28.699 element at address: 0x200000364300 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:24:28.699 element at address: 0x200000364080 with size: 0.000244 MiB
00:24:28.699 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:24:28.699 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:24:28.699 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:24:28.699 11:35:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:24:28.699 11:35:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 100480
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@949 -- # '[' -z 100480 ']'
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@953 -- # kill -0 100480
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@954 -- # uname
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 100480
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@967 -- # echo 'killing process with pid 100480'
00:24:28.699 killing process with pid 100480
11:35:12 dpdk_mem_utility -- common/autotest_common.sh@968 -- # kill 100480
00:24:28.699 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@973 -- # wait 100480
00:24:28.959
00:24:28.959 real 0m1.505s
00:24:28.959 user 0m1.508s
00:24:28.959 sys 0m0.502s
00:24:28.959 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:28.959 11:35:12 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:24:28.959 ************************************
00:24:28.959 END TEST dpdk_mem_utility
00:24:28.959 ************************************
00:24:29.218 11:35:12 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:24:29.218 11:35:12 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:24:29.218 11:35:12 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:29.218 11:35:12 -- common/autotest_common.sh@10 -- # set +x
00:24:29.218 ************************************
00:24:29.218 START TEST event
************************************
00:24:29.218 11:35:12 event -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:24:29.218 * Looking for test storage...
00:24:29.218 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:24:29.218 11:35:13 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:24:29.218 11:35:13 event -- bdev/nbd_common.sh@6 -- # set -e
00:24:29.218 11:35:13 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:24:29.218 11:35:13 event -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']'
00:24:29.218 11:35:13 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:29.218 11:35:13 event -- common/autotest_common.sh@10 -- # set +x
00:24:29.218 ************************************
00:24:29.218 START TEST event_perf
************************************
00:24:29.218 11:35:13 event.event_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:24:29.218 Running I/O for 1 seconds...[2024-06-10 11:35:13.128700] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:24:29.218 [2024-06-10 11:35:13.128762] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid100742 ]
00:24:29.478 [2024-06-10 11:35:13.217463] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:24:29.478 [2024-06-10 11:35:13.302539] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1
00:24:29.478 [2024-06-10 11:35:13.302625] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2
00:24:29.478 [2024-06-10 11:35:13.302702] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3
00:24:29.478 [2024-06-10 11:35:13.302704] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:30.855 Running I/O for 1 seconds...
00:24:30.855 lcore 0: 210940
00:24:30.855 lcore 1: 210939
00:24:30.855 lcore 2: 210940
00:24:30.855 lcore 3: 210940
00:24:30.855 done.
00:24:30.855
00:24:30.855 real 0m1.277s
00:24:30.855 user 0m4.161s
00:24:30.855 sys 0m0.110s
00:24:30.855 11:35:14 event.event_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:30.855 11:35:14 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:24:30.855 ************************************
00:24:30.855 END TEST event_perf
00:24:30.855 ************************************
00:24:30.855 11:35:14 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:24:30.855 11:35:14 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']'
00:24:30.855 11:35:14 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:30.855 11:35:14 event -- common/autotest_common.sh@10 -- # set +x
00:24:30.855 ************************************
00:24:30.855 START TEST event_reactor
************************************
00:24:30.855 11:35:14 event.event_reactor -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:24:30.855 [2024-06-10 11:35:14.478143] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:24:30.855 [2024-06-10 11:35:14.478200] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid100944 ]
00:24:30.855 [2024-06-10 11:35:14.564166] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:30.855 [2024-06-10 11:35:14.645710] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:31.792 test_start
00:24:31.792 oneshot
00:24:31.792 tick 100
00:24:31.792 tick 100
00:24:31.792 tick 250
00:24:31.792 tick 100
00:24:31.792 tick 100
00:24:31.792 tick 250
00:24:31.792 tick 100
00:24:31.792 tick 500
00:24:31.792 tick 100
00:24:31.792 tick 100
00:24:31.792 tick 250
00:24:31.792 tick 100
00:24:31.792 tick 100
00:24:31.792 test_end
00:24:31.792
00:24:31.792 real 0m1.271s
00:24:31.792 user 0m1.158s
00:24:31.792 sys 0m0.108s
00:24:31.792 11:35:15 event.event_reactor -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:31.792 11:35:15 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:24:31.792 ************************************
00:24:31.792 END TEST event_reactor
00:24:31.792 ************************************
00:24:32.051 11:35:15 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:24:32.051 11:35:15 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']'
00:24:32.051 11:35:15 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:32.051 11:35:15 event -- common/autotest_common.sh@10 -- # set +x
00:24:32.051 ************************************
00:24:32.051 START TEST event_reactor_perf
************************************
00:24:32.051 11:35:15 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:24:32.051 [2024-06-10 11:35:15.835416] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:24:32.051 [2024-06-10 11:35:15.835487] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid101141 ]
00:24:32.051 [2024-06-10 11:35:15.923341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:32.309 [2024-06-10 11:35:16.009371] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:33.246 test_start
00:24:33.246 test_end
00:24:33.246 Performance: 511824 events per second
00:24:33.246
00:24:33.246 real 0m1.281s
00:24:33.246 user 0m1.173s
00:24:33.246 sys 0m0.103s
00:24:33.247 11:35:17 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:33.247 11:35:17 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:24:33.247 ************************************
00:24:33.247 END TEST event_reactor_perf
00:24:33.247 ************************************
00:24:33.247 11:35:17 event -- event/event.sh@49 -- # uname -s
00:24:33.247 11:35:17 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:24:33.247 11:35:17 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:24:33.247 11:35:17 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:24:33.247 11:35:17 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:33.247 11:35:17 event -- common/autotest_common.sh@10 -- # set +x
00:24:33.247 ************************************
00:24:33.247 START TEST event_scheduler
************************************
11:35:17 event.event_scheduler -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:24:33.506 * Looking for test storage...
00:24:33.506 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:24:33.506 11:35:17 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:24:33.506 11:35:17 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=101359
00:24:33.506 11:35:17 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:24:33.506 11:35:17 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 101359
00:24:33.506 11:35:17 event.event_scheduler -- common/autotest_common.sh@830 -- # '[' -z 101359 ']'
00:24:33.506 11:35:17 event.event_scheduler -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:24:33.506 11:35:17 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:24:33.506 11:35:17 event.event_scheduler -- common/autotest_common.sh@835 -- # local max_retries=100
00:24:33.506 11:35:17 event.event_scheduler -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:24:33.506 11:35:17 event.event_scheduler -- common/autotest_common.sh@839 -- # xtrace_disable
00:24:33.506 11:35:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:24:33.506 [2024-06-10 11:35:17.330903] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:24:33.506 [2024-06-10 11:35:17.330955] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid101359 ]
00:24:33.506 [2024-06-10 11:35:17.414789] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:24:33.765 [2024-06-10 11:35:17.501088] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:33.765 [2024-06-10 11:35:17.501164] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1
00:24:33.765 [2024-06-10 11:35:17.501238] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3
00:24:33.765 [2024-06-10 11:35:17.501240] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@863 -- # return 0
00:24:34.333 11:35:18 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:24:34.333 POWER: Env isn't set yet!
00:24:34.333 POWER: Attempting to initialise ACPI cpufreq power management...
00:24:34.333 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:24:34.333 POWER: Cannot set governor of lcore 0 to userspace
00:24:34.333 POWER: Attempting to initialise PSTAT power management...
00:24:34.333 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:24:34.333 POWER: Initialized successfully for lcore 0 power management
00:24:34.333 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:24:34.333 POWER: Initialized successfully for lcore 1 power management
00:24:34.333 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:24:34.333 POWER: Initialized successfully for lcore 2 power management
00:24:34.333 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:24:34.333 POWER: Initialized successfully for lcore 3 power management
00:24:34.333 [2024-06-10 11:35:18.205037] scheduler_dynamic.c: 382:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:24:34.333 [2024-06-10 11:35:18.205052] scheduler_dynamic.c: 384:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:24:34.333 [2024-06-10 11:35:18.205062] scheduler_dynamic.c: 386:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.333 11:35:18 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.333 11:35:18 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:24:34.592 [2024-06-10 11:35:18.296066] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:24:34.592 11:35:18 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.592 11:35:18 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:24:34.592 11:35:18 event.event_scheduler -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:24:34.592 11:35:18 event.event_scheduler -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:34.592 11:35:18 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:24:34.592 ************************************
00:24:34.592 START TEST scheduler_create_thread
************************************
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # scheduler_create_thread
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 2
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 3
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 4
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 5
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 6
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 7
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 8
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 9
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:24:34.593 10
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n
half_active -a 0 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:24:34.593 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:24:35.161 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:24:35.161 11:35:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:24:35.161 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:24:35.161 11:35:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:24:36.540 11:35:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:24:36.540 11:35:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:24:36.540 11:35:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:24:36.540 11:35:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:24:36.540 11:35:20 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:24:37.918 11:35:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:24:37.918 00:24:37.918 real 0m3.103s 00:24:37.918 user 0m0.026s 00:24:37.918 sys 0m0.005s 00:24:37.918 11:35:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:37.918 11:35:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:24:37.918 ************************************ 00:24:37.918 END TEST scheduler_create_thread 00:24:37.918 ************************************ 00:24:37.918 11:35:21 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:24:37.918 11:35:21 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 101359 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@949 -- # '[' -z 101359 ']' 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@953 -- # kill -0 101359 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@954 -- # uname 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 101359 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@955 -- # process_name=reactor_2 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' reactor_2 = sudo ']' 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@967 -- # echo 'killing process with pid 101359' 00:24:37.918 killing process with pid 101359 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@968 -- # kill 101359 00:24:37.918 11:35:21 event.event_scheduler -- common/autotest_common.sh@973 -- # wait 101359 00:24:37.918 [2024-06-10 11:35:21.819569] 
scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:24:38.178 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:24:38.178 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:24:38.178 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:24:38.178 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:24:38.178 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:24:38.178 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:24:38.178 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:24:38.178 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:24:38.178 00:24:38.178 real 0m4.871s 00:24:38.178 user 0m9.405s 00:24:38.178 sys 0m0.436s 00:24:38.178 11:35:22 event.event_scheduler -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:38.178 11:35:22 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:24:38.178 ************************************ 00:24:38.178 END TEST event_scheduler 00:24:38.178 ************************************ 00:24:38.178 11:35:22 event -- event/event.sh@51 -- # modprobe -n nbd 00:24:38.178 11:35:22 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:24:38.178 11:35:22 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:38.178 11:35:22 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:38.178 11:35:22 event -- common/autotest_common.sh@10 -- # set +x 00:24:38.437 ************************************ 00:24:38.437 START TEST app_repeat 00:24:38.437 ************************************ 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@1124 -- # app_repeat_test 00:24:38.437 
11:35:22 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@19 -- # repeat_pid=102032 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 102032' 00:24:38.437 Process app_repeat pid: 102032 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:24:38.437 spdk_app_start Round 0 00:24:38.437 11:35:22 event.app_repeat -- event/event.sh@25 -- # waitforlisten 102032 /var/tmp/spdk-nbd.sock 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 102032 ']' 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:24:38.437 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:38.437 11:35:22 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:24:38.437 [2024-06-10 11:35:22.177974] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:24:38.437 [2024-06-10 11:35:22.178021] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid102032 ] 00:24:38.437 [2024-06-10 11:35:22.264951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:38.437 [2024-06-10 11:35:22.355835] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:24:38.437 [2024-06-10 11:35:22.355838] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:39.375 11:35:23 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:39.375 11:35:23 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:24:39.375 11:35:23 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:24:39.375 Malloc0 00:24:39.375 11:35:23 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:24:39.635 Malloc1 00:24:39.635 11:35:23 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@91 -- # 
local bdev_list 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:39.635 11:35:23 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:24:39.635 /dev/nbd0 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:39.894 11:35:23 event.app_repeat -- 
common/autotest_common.sh@872 -- # break 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:24:39.894 1+0 records in 00:24:39.894 1+0 records out 00:24:39.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248934 s, 16.5 MB/s 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:24:39.894 /dev/nbd1 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:39.894 11:35:23 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:39.894 11:35:23 event.app_repeat -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:24:39.894 1+0 records in 00:24:39.894 1+0 records out 00:24:39.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000349537 s, 11.7 MB/s 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:24:39.894 11:35:23 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:24:39.895 11:35:23 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:24:39.895 11:35:23 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:39.895 11:35:23 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:24:39.895 11:35:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:39.895 11:35:23 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:39.895 11:35:23 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:24:39.895 11:35:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:39.895 11:35:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:24:40.154 11:35:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:24:40.154 { 00:24:40.154 "nbd_device": 
"/dev/nbd0", 00:24:40.154 "bdev_name": "Malloc0" 00:24:40.154 }, 00:24:40.154 { 00:24:40.154 "nbd_device": "/dev/nbd1", 00:24:40.154 "bdev_name": "Malloc1" 00:24:40.154 } 00:24:40.154 ]' 00:24:40.154 11:35:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:24:40.154 { 00:24:40.154 "nbd_device": "/dev/nbd0", 00:24:40.154 "bdev_name": "Malloc0" 00:24:40.154 }, 00:24:40.154 { 00:24:40.154 "nbd_device": "/dev/nbd1", 00:24:40.154 "bdev_name": "Malloc1" 00:24:40.154 } 00:24:40.154 ]' 00:24:40.154 11:35:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:24:40.154 /dev/nbd1' 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:24:40.154 /dev/nbd1' 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:24:40.154 256+0 records in 00:24:40.154 256+0 records out 00:24:40.154 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103934 s, 101 MB/s 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:24:40.154 256+0 records in 00:24:40.154 256+0 records out 00:24:40.154 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199289 s, 52.6 MB/s 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:24:40.154 11:35:24 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:24:40.413 256+0 records in 00:24:40.413 256+0 records out 00:24:40.413 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214723 s, 48.8 MB/s 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:24:40.413 11:35:24 event.app_repeat -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:40.413 11:35:24 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:40.672 11:35:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:24:40.931 11:35:24 
event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:24:40.931 11:35:24 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:24:40.931 11:35:24 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:24:41.190 11:35:24 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:24:41.448 [2024-06-10 11:35:25.168688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:41.448 [2024-06-10 11:35:25.246203] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:24:41.448 [2024-06-10 11:35:25.246206] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:41.448 [2024-06-10 11:35:25.291515] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:24:41.448 [2024-06-10 11:35:25.291561] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:24:44.729 11:35:27 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:24:44.729 11:35:27 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:24:44.729 spdk_app_start Round 1 00:24:44.729 11:35:27 event.app_repeat -- event/event.sh@25 -- # waitforlisten 102032 /var/tmp/spdk-nbd.sock 00:24:44.729 11:35:27 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 102032 ']' 00:24:44.729 11:35:27 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:24:44.729 11:35:27 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:44.729 11:35:27 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:24:44.729 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:24:44.729 11:35:27 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:44.729 11:35:27 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:24:44.729 11:35:28 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:44.729 11:35:28 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:24:44.729 11:35:28 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:24:44.729 Malloc0 00:24:44.729 11:35:28 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:24:44.729 Malloc1 00:24:44.729 11:35:28 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:44.730 11:35:28 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:24:44.730 /dev/nbd0 00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:24:45.025 1+0 records in 00:24:45.025 1+0 records out 00:24:45.025 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252166 s, 16.2 MB/s 00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:24:45.025 11:35:28 event.app_repeat 
-- common/autotest_common.sh@885 -- # size=4096
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']'
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@888 -- # return 0
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:24:45.025 /dev/nbd1
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@868 -- # local i
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 ))
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 ))
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@872 -- # break
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 ))
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 ))
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:24:45.025 1+0 records in
00:24:45.025 1+0 records out
00:24:45.025 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000121482 s, 33.7 MB/s
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']'
00:24:45.025 11:35:28 event.app_repeat -- common/autotest_common.sh@888 -- # return 0
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:45.025 11:35:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:24:45.319 {
00:24:45.319 "nbd_device": "/dev/nbd0",
00:24:45.319 "bdev_name": "Malloc0"
00:24:45.319 },
00:24:45.319 {
00:24:45.319 "nbd_device": "/dev/nbd1",
00:24:45.319 "bdev_name": "Malloc1"
00:24:45.319 }
00:24:45.319 ]'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:24:45.319 {
00:24:45.319 "nbd_device": "/dev/nbd0",
00:24:45.319 "bdev_name": "Malloc0"
00:24:45.319 },
00:24:45.319 {
00:24:45.319 "nbd_device": "/dev/nbd1",
00:24:45.319 "bdev_name": "Malloc1"
00:24:45.319 }
00:24:45.319 ]'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:24:45.319 /dev/nbd1'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:24:45.319 /dev/nbd1'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:24:45.319 256+0 records in
00:24:45.319 256+0 records out
00:24:45.319 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113717 s, 92.2 MB/s
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:24:45.319 256+0 records in
00:24:45.319 256+0 records out
00:24:45.319 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203032 s, 51.6 MB/s
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:24:45.319 256+0 records in
00:24:45.319 256+0 records out
00:24:45.319 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212672 s, 49.3 MB/s
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:24:45.319 11:35:29 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:24:45.320 11:35:29 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:24:45.578 11:35:29 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:45.836 11:35:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:24:46.094 11:35:29 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:24:46.094 11:35:29 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:24:46.352 11:35:30 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:24:46.352 [2024-06-10 11:35:30.258193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:24:46.611 [2024-06-10 11:35:30.340674] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1
00:24:46.611 [2024-06-10 11:35:30.340677] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:46.611 [2024-06-10 11:35:30.390413] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:24:46.611 [2024-06-10 11:35:30.390458] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:24:49.140 11:35:33 event.app_repeat -- event/event.sh@23 -- # for i in {0..2}
00:24:49.140 11:35:33 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2'
00:24:49.140 spdk_app_start Round 2
00:24:49.140 11:35:33 event.app_repeat -- event/event.sh@25 -- # waitforlisten 102032 /var/tmp/spdk-nbd.sock
00:24:49.140 11:35:33 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 102032 ']'
00:24:49.140 11:35:33 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:24:49.140 11:35:33 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100
00:24:49.140 11:35:33 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:24:49.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:24:49.140 11:35:33 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable
00:24:49.140 11:35:33 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:24:49.398 11:35:33 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:24:49.398 11:35:33 event.app_repeat -- common/autotest_common.sh@863 -- # return 0
00:24:49.398 11:35:33 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:24:49.656 Malloc0
00:24:49.656 11:35:33 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:24:49.656 Malloc1
00:24:49.656 11:35:33 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:24:49.656 11:35:33 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:24:49.915 /dev/nbd0
00:24:49.915 11:35:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:24:49.915 11:35:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@868 -- # local i
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 ))
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 ))
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@872 -- # break
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 ))
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 ))
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:24:49.915 1+0 records in
00:24:49.915 1+0 records out
00:24:49.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211523 s, 19.4 MB/s
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']'
00:24:49.915 11:35:33 event.app_repeat -- common/autotest_common.sh@888 -- # return 0
00:24:49.915 11:35:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:24:49.915 11:35:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:24:49.915 11:35:33 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:24:50.173 /dev/nbd1
00:24:50.173 11:35:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:24:50.173 11:35:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@868 -- # local i
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 ))
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 ))
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@872 -- # break
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 ))
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 ))
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:24:50.173 1+0 records in
00:24:50.173 1+0 records out
00:24:50.173 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000184192 s, 22.2 MB/s
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']'
00:24:50.173 11:35:34 event.app_repeat -- common/autotest_common.sh@888 -- # return 0
00:24:50.173 11:35:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:24:50.173 11:35:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:24:50.173 11:35:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:24:50.173 11:35:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:50.173 11:35:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:24:50.432 {
00:24:50.432 "nbd_device": "/dev/nbd0",
00:24:50.432 "bdev_name": "Malloc0"
00:24:50.432 },
00:24:50.432 {
00:24:50.432 "nbd_device": "/dev/nbd1",
00:24:50.432 "bdev_name": "Malloc1"
00:24:50.432 }
00:24:50.432 ]'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:24:50.432 {
00:24:50.432 "nbd_device": "/dev/nbd0",
00:24:50.432 "bdev_name": "Malloc0"
00:24:50.432 },
00:24:50.432 {
00:24:50.432 "nbd_device": "/dev/nbd1",
00:24:50.432 "bdev_name": "Malloc1"
00:24:50.432 }
00:24:50.432 ]'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:24:50.432 /dev/nbd1'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:24:50.432 /dev/nbd1'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:24:50.432 256+0 records in
00:24:50.432 256+0 records out
00:24:50.432 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103931 s, 101 MB/s
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:24:50.432 256+0 records in
00:24:50.432 256+0 records out
00:24:50.432 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200165 s, 52.4 MB/s
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:24:50.432 256+0 records in
00:24:50.432 256+0 records out
00:24:50.432 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215778 s, 48.6 MB/s
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:24:50.432 11:35:34 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:24:50.691 11:35:34 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:24:50.950 11:35:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:24:51.209 11:35:34 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:24:51.209 11:35:34 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:24:51.468 11:35:35 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:24:51.468 [2024-06-10 11:35:35.372840] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:24:51.726 [2024-06-10 11:35:35.452962] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1
00:24:51.726 [2024-06-10 11:35:35.452965] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:51.726 [2024-06-10 11:35:35.501392] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:24:51.727 [2024-06-10 11:35:35.501437] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:24:54.259 11:35:38 event.app_repeat -- event/event.sh@38 -- # waitforlisten 102032 /var/tmp/spdk-nbd.sock
00:24:54.259 11:35:38 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 102032 ']'
00:24:54.259 11:35:38 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:24:54.259 11:35:38 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100
00:24:54.259 11:35:38 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:24:54.259 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:24:54.259 11:35:38 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable
00:24:54.259 11:35:38 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@863 -- # return 0
00:24:54.518 11:35:38 event.app_repeat -- event/event.sh@39 -- # killprocess 102032
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@949 -- # '[' -z 102032 ']'
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@953 -- # kill -0 102032
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@954 -- # uname
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 102032
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 102032'
00:24:54.518 killing process with pid 102032
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@968 -- # kill 102032
00:24:54.518 11:35:38 event.app_repeat -- common/autotest_common.sh@973 -- # wait 102032
00:24:54.778 spdk_app_start is called in Round 0.
00:24:54.778 Shutdown signal received, stop current app iteration
00:24:54.778 Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 reinitialization...
00:24:54.778 spdk_app_start is called in Round 1.
00:24:54.778 Shutdown signal received, stop current app iteration
00:24:54.778 Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 reinitialization...
00:24:54.778 spdk_app_start is called in Round 2.
00:24:54.778 Shutdown signal received, stop current app iteration
00:24:54.778 Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 reinitialization...
00:24:54.778 spdk_app_start is called in Round 3.
00:24:54.778 Shutdown signal received, stop current app iteration
00:24:54.778 11:35:38 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT
00:24:54.778 11:35:38 event.app_repeat -- event/event.sh@42 -- # return 0
00:24:54.778
00:24:54.778 real 0m16.427s
00:24:54.778 user 0m34.768s
00:24:54.778 sys 0m3.131s
00:24:54.778 11:35:38 event.app_repeat -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:54.778 11:35:38 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:24:54.778 ************************************
00:24:54.778 END TEST app_repeat
00:24:54.778 ************************************
00:24:54.778 11:35:38 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 ))
00:24:54.778
00:24:54.778 real 0m25.667s
00:24:54.778 user 0m50.849s
00:24:54.778 sys 0m4.287s
00:24:54.778 11:35:38 event -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:54.778 11:35:38 event -- common/autotest_common.sh@10 -- # set +x
00:24:54.778 ************************************
00:24:54.778 END TEST event
00:24:54.778 ************************************
00:24:54.778 11:35:38 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh
00:24:54.778 11:35:38 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:24:54.778 11:35:38 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:54.778 11:35:38 -- common/autotest_common.sh@10 -- # set +x
00:24:54.778 ************************************
00:24:54.778 START TEST thread
00:24:54.778 ************************************
00:24:54.778 11:35:38 thread -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh
00:24:55.039 * Looking for test storage...
00:24:55.039 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread
00:24:55.039 11:35:38 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:24:55.039 11:35:38 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']'
00:24:55.039 11:35:38 thread -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:55.039 11:35:38 thread -- common/autotest_common.sh@10 -- # set +x
00:24:55.039 ************************************
00:24:55.039 START TEST thread_poller_perf
00:24:55.039 ************************************
00:24:55.039 11:35:38 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:24:55.039 [2024-06-10 11:35:38.844743] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:24:55.039 [2024-06-10 11:35:38.844799] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid104466 ]
00:24:55.039 [2024-06-10 11:35:38.931466] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:55.298 [2024-06-10 11:35:39.011454] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:24:55.298 Running 1000 pollers for 1 seconds with 1 microseconds period.
00:24:56.234 ======================================
00:24:56.234 busy:2304821346 (cyc)
00:24:56.234 total_run_count: 417000
00:24:56.234 tsc_hz: 2300000000 (cyc)
00:24:56.234 ======================================
00:24:56.234 poller_cost: 5527 (cyc), 2403 (nsec)
00:24:56.234
00:24:56.234 real 0m1.275s
00:24:56.234 user 0m1.172s
00:24:56.234 sys 0m0.098s
00:24:56.234 11:35:40 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:24:56.234 11:35:40 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:24:56.234 ************************************
00:24:56.234 END TEST thread_poller_perf
00:24:56.234 ************************************
00:24:56.234 11:35:40 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:24:56.234 11:35:40 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']'
00:24:56.234 11:35:40 thread -- common/autotest_common.sh@1106 -- # xtrace_disable
00:24:56.234 11:35:40 thread -- common/autotest_common.sh@10 -- # set +x
00:24:56.234 ************************************
00:24:56.234 START TEST thread_poller_perf
00:24:56.234 ************************************
00:24:56.234 11:35:40 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:24:56.493 [2024-06-10 11:35:40.186928] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:24:56.493 [2024-06-10 11:35:40.186982] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid104683 ] 00:24:56.493 [2024-06-10 11:35:40.275664] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:56.493 [2024-06-10 11:35:40.356736] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:56.493 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:24:57.870 ====================================== 00:24:57.870 busy:2301636244 (cyc) 00:24:57.870 total_run_count: 5545000 00:24:57.870 tsc_hz: 2300000000 (cyc) 00:24:57.870 ====================================== 00:24:57.870 poller_cost: 415 (cyc), 180 (nsec) 00:24:57.870 00:24:57.870 real 0m1.273s 00:24:57.870 user 0m1.154s 00:24:57.870 sys 0m0.113s 00:24:57.870 11:35:41 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:57.870 11:35:41 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:24:57.870 ************************************ 00:24:57.870 END TEST thread_poller_perf 00:24:57.870 ************************************ 00:24:57.870 11:35:41 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:24:57.870 00:24:57.870 real 0m2.778s 00:24:57.870 user 0m2.407s 00:24:57.870 sys 0m0.379s 00:24:57.870 11:35:41 thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:57.870 11:35:41 thread -- common/autotest_common.sh@10 -- # set +x 00:24:57.871 ************************************ 00:24:57.871 END TEST thread 00:24:57.871 ************************************ 00:24:57.871 11:35:41 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:24:57.871 11:35:41 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:24:57.871 11:35:41 -- common/autotest_common.sh@1106 -- # xtrace_disable 
00:24:57.871 11:35:41 -- common/autotest_common.sh@10 -- # set +x 00:24:57.871 ************************************ 00:24:57.871 START TEST accel 00:24:57.871 ************************************ 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:24:57.871 * Looking for test storage... 00:24:57.871 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:24:57.871 11:35:41 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:24:57.871 11:35:41 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:24:57.871 11:35:41 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:24:57.871 11:35:41 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=104917 00:24:57.871 11:35:41 accel -- accel/accel.sh@63 -- # waitforlisten 104917 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@830 -- # '[' -z 104917 ']' 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:57.871 11:35:41 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:57.871 11:35:41 accel -- accel/accel.sh@61 -- # build_accel_config 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:57.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:24:57.871 11:35:41 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:57.871 11:35:41 accel -- common/autotest_common.sh@10 -- # set +x 00:24:57.871 11:35:41 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:24:57.871 11:35:41 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:24:57.871 11:35:41 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:24:57.871 11:35:41 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:24:57.871 11:35:41 accel -- accel/accel.sh@40 -- # local IFS=, 00:24:57.871 11:35:41 accel -- accel/accel.sh@41 -- # jq -r . 00:24:57.871 [2024-06-10 11:35:41.707719] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:24:57.871 [2024-06-10 11:35:41.707776] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid104917 ] 00:24:57.871 [2024-06-10 11:35:41.792440] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:58.129 [2024-06-10 11:35:41.873177] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:58.696 11:35:42 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:58.696 11:35:42 accel -- common/autotest_common.sh@863 -- # return 0 00:24:58.696 11:35:42 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:24:58.696 11:35:42 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:24:58.696 11:35:42 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:24:58.696 11:35:42 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:24:58.696 11:35:42 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:24:58.696 11:35:42 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:24:58.696 11:35:42 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:24:58.696 11:35:42 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:24:58.696 11:35:42 accel -- common/autotest_common.sh@10 -- # set +x 00:24:58.696 11:35:42 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.696 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # 
IFS== 00:24:58.696 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.696 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.697 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.697 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.697 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.697 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.697 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.697 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.697 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.697 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.697 11:35:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:24:58.697 11:35:42 accel -- accel/accel.sh@72 -- # IFS== 00:24:58.697 11:35:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:24:58.697 11:35:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:24:58.697 11:35:42 accel -- accel/accel.sh@75 -- # killprocess 104917 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@949 -- # '[' -z 104917 ']' 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@953 -- # kill -0 104917 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@954 -- # uname 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 104917 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 104917' 00:24:58.697 killing process with pid 104917 00:24:58.697 11:35:42 accel -- common/autotest_common.sh@968 -- # kill 104917 00:24:58.697 
11:35:42 accel -- common/autotest_common.sh@973 -- # wait 104917 00:24:59.265 11:35:42 accel -- accel/accel.sh@76 -- # trap - ERR 00:24:59.265 11:35:42 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:24:59.265 11:35:42 accel -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:24:59.265 11:35:42 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:59.265 11:35:42 accel -- common/autotest_common.sh@10 -- # set +x 00:24:59.265 11:35:42 accel.accel_help -- common/autotest_common.sh@1124 -- # accel_perf -h 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:24:59.265 11:35:42 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:24:59.265 11:35:43 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:24:59.265 11:35:43 accel.accel_help -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:59.265 11:35:43 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:24:59.265 11:35:43 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:24:59.265 11:35:43 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:24:59.265 11:35:43 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:59.265 11:35:43 accel -- common/autotest_common.sh@10 -- # set +x 00:24:59.265 ************************************ 00:24:59.265 START TEST accel_missing_filename 00:24:59.265 ************************************ 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@649 -- # local es=0 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # type -t accel_perf 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:59.265 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:24:59.265 11:35:43 
accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:24:59.265 11:35:43 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:24:59.265 [2024-06-10 11:35:43.150443] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:24:59.265 [2024-06-10 11:35:43.150503] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid105139 ] 00:24:59.524 [2024-06-10 11:35:43.237751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:59.524 [2024-06-10 11:35:43.319651] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:59.524 [2024-06-10 11:35:43.379182] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:24:59.524 [2024-06-10 11:35:43.449021] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:24:59.784 A filename is required. 
00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # es=234 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # es=106 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # case "$es" in 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@669 -- # es=1 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:59.784 00:24:59.784 real 0m0.416s 00:24:59.784 user 0m0.285s 00:24:59.784 sys 0m0.161s 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:59.784 11:35:43 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:24:59.784 ************************************ 00:24:59.784 END TEST accel_missing_filename 00:24:59.784 ************************************ 00:24:59.784 11:35:43 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:24:59.784 11:35:43 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:24:59.784 11:35:43 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:59.784 11:35:43 accel -- common/autotest_common.sh@10 -- # set +x 00:24:59.784 ************************************ 00:24:59.784 START TEST accel_compress_verify 00:24:59.784 ************************************ 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@649 -- # local es=0 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # valid_exec_arg 
accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # type -t accel_perf 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:59.784 11:35:43 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:24:59.784 11:35:43 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:24:59.784 [2024-06-10 11:35:43.623399] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:24:59.784 [2024-06-10 11:35:43.623449] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid105182 ] 00:24:59.784 [2024-06-10 11:35:43.708114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:00.043 [2024-06-10 11:35:43.792897] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:00.043 [2024-06-10 11:35:43.858797] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:25:00.043 [2024-06-10 11:35:43.929097] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:25:00.303 00:25:00.303 Compression does not support the verify option, aborting. 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # es=161 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # es=33 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # case "$es" in 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@669 -- # es=1 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:00.303 00:25:00.303 real 0m0.411s 00:25:00.303 user 0m0.264s 00:25:00.303 sys 0m0.151s 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:00.303 11:35:44 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:25:00.303 ************************************ 00:25:00.303 END TEST accel_compress_verify 00:25:00.303 ************************************ 00:25:00.303 11:35:44 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:25:00.303 11:35:44 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 
']' 00:25:00.303 11:35:44 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:00.303 11:35:44 accel -- common/autotest_common.sh@10 -- # set +x 00:25:00.303 ************************************ 00:25:00.303 START TEST accel_wrong_workload 00:25:00.303 ************************************ 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w foobar 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@649 -- # local es=0 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # type -t accel_perf 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w foobar 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@40 -- # 
local IFS=, 00:25:00.303 11:35:44 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:25:00.303 Unsupported workload type: foobar 00:25:00.303 [2024-06-10 11:35:44.101180] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:25:00.303 accel_perf options: 00:25:00.303 [-h help message] 00:25:00.303 [-q queue depth per core] 00:25:00.303 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:25:00.303 [-T number of threads per core 00:25:00.303 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:25:00.303 [-t time in seconds] 00:25:00.303 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:25:00.303 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:25:00.303 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:25:00.303 [-l for compress/decompress workloads, name of uncompressed input file 00:25:00.303 [-S for crc32c workload, use this seed value (default 0) 00:25:00.303 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:25:00.303 [-f for fill workload, use this BYTE value (default 255) 00:25:00.303 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:25:00.303 [-y verify result if this switch is on] 00:25:00.303 [-a tasks to allocate per core (default: same value as -q)] 00:25:00.303 Can be used to spread operations across a wider range of memory. 
00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # es=1 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:00.303 00:25:00.303 real 0m0.024s 00:25:00.303 user 0m0.012s 00:25:00.303 sys 0m0.013s 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:00.303 11:35:44 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:25:00.303 ************************************ 00:25:00.303 END TEST accel_wrong_workload 00:25:00.303 ************************************ 00:25:00.303 Error: writing output failed: Broken pipe 00:25:00.303 11:35:44 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:25:00.303 11:35:44 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:25:00.303 11:35:44 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:00.303 11:35:44 accel -- common/autotest_common.sh@10 -- # set +x 00:25:00.303 ************************************ 00:25:00.303 START TEST accel_negative_buffers 00:25:00.303 ************************************ 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@649 -- # local es=0 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:00.303 11:35:44 
accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # type -t accel_perf 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:00.303 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w xor -y -x -1 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:25:00.303 11:35:44 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:25:00.303 -x option must be non-negative. 00:25:00.303 [2024-06-10 11:35:44.221515] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:25:00.303 accel_perf options: 00:25:00.303 [-h help message] 00:25:00.303 [-q queue depth per core] 00:25:00.303 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:25:00.303 [-T number of threads per core 00:25:00.303 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:25:00.303 [-t time in seconds] 00:25:00.303 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:25:00.304 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:25:00.304 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:25:00.304 [-l for compress/decompress workloads, name of uncompressed input file 00:25:00.304 [-S for crc32c workload, use this seed value (default 0) 00:25:00.304 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:25:00.304 [-f for fill workload, use this BYTE value (default 255) 00:25:00.304 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:25:00.304 [-y verify result if this switch is on] 00:25:00.304 [-a tasks to allocate per core (default: same value as -q)] 00:25:00.304 Can be used to spread operations across a wider range of memory. 
00:25:00.304 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # es=1 00:25:00.304 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:00.304 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:00.304 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:00.304 00:25:00.304 real 0m0.041s 00:25:00.304 user 0m0.022s 00:25:00.304 sys 0m0.019s 00:25:00.304 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:00.304 11:35:44 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:25:00.304 ************************************ 00:25:00.304 END TEST accel_negative_buffers 00:25:00.304 ************************************ 00:25:00.304 Error: writing output failed: Broken pipe 00:25:00.563 11:35:44 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:25:00.563 11:35:44 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:25:00.563 11:35:44 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:00.563 11:35:44 accel -- common/autotest_common.sh@10 -- # set +x 00:25:00.563 ************************************ 00:25:00.563 START TEST accel_crc32c 00:25:00.563 ************************************ 00:25:00.563 11:35:44 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -S 32 -y 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:25:00.563 11:35:44 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:25:00.563 [2024-06-10 11:35:44.337364] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:00.563 [2024-06-10 11:35:44.337416] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid105390 ] 00:25:00.563 [2024-06-10 11:35:44.422997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:00.563 [2024-06-10 11:35:44.504234] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:00.822 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:00.822 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.822 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.822 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.822 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:00.822 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 
11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val= 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # 
val=Yes 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:00.823 11:35:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:45 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:25:02.200 11:35:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:02.200 00:25:02.200 real 0m1.420s 00:25:02.200 user 0m1.266s 00:25:02.200 sys 0m0.160s 00:25:02.200 11:35:45 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:02.200 11:35:45 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:25:02.200 ************************************ 00:25:02.200 END TEST accel_crc32c 00:25:02.200 ************************************ 00:25:02.200 11:35:45 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:25:02.200 11:35:45 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:25:02.200 11:35:45 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:02.200 11:35:45 accel -- common/autotest_common.sh@10 -- # set +x 00:25:02.200 ************************************ 00:25:02.200 START TEST accel_crc32c_C2 00:25:02.200 ************************************ 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 
-w crc32c -y -C 2 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:25:02.200 11:35:45 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:25:02.200 [2024-06-10 11:35:45.810099] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:02.200 [2024-06-10 11:35:45.810141] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid105587 ] 00:25:02.200 [2024-06-10 11:35:45.894783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:02.200 [2024-06-10 11:35:45.976048] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.200 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- 
# val=32 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:02.201 11:35:46 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:02.201 11:35:46 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 
00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:03.577 00:25:03.577 real 0m1.410s 00:25:03.577 user 0m1.264s 00:25:03.577 sys 0m0.146s 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:03.577 11:35:47 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:25:03.577 ************************************ 00:25:03.577 END TEST accel_crc32c_C2 00:25:03.577 ************************************ 00:25:03.577 11:35:47 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:25:03.577 11:35:47 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:03.577 11:35:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:03.577 11:35:47 accel -- common/autotest_common.sh@10 -- # set +x 00:25:03.577 ************************************ 00:25:03.577 START TEST accel_copy 00:25:03.577 ************************************ 00:25:03.577 11:35:47 accel.accel_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy -y 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 
-t 1 -w copy -y 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:25:03.577 [2024-06-10 11:35:47.293813] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:03.577 [2024-06-10 11:35:47.293943] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid105784 ] 00:25:03.577 [2024-06-10 11:35:47.377921] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:03.577 [2024-06-10 11:35:47.460251] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.577 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@22 -- # 
accel_module=software 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r 
var val 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:03.836 11:35:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:03.837 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:03.837 11:35:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:04.773 11:35:48 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:25:04.773 11:35:48 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:04.773 00:25:04.773 real 0m1.415s 00:25:04.773 user 0m1.258s 00:25:04.773 sys 0m0.162s 00:25:04.773 11:35:48 accel.accel_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:04.773 11:35:48 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:25:04.773 ************************************ 00:25:04.773 END TEST accel_copy 00:25:04.773 ************************************ 00:25:04.773 11:35:48 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:25:04.773 11:35:48 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:25:04.773 11:35:48 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:04.773 11:35:48 accel -- common/autotest_common.sh@10 -- # set +x 00:25:05.032 ************************************ 00:25:05.032 START TEST accel_fill 00:25:05.032 ************************************ 00:25:05.032 11:35:48 accel.accel_fill -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w fill -f 128 -q 64 -a 64 -y 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:25:05.032 11:35:48 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:25:05.032 [2024-06-10 11:35:48.786651] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:05.032 [2024-06-10 11:35:48.786706] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid105977 ] 00:25:05.032 [2024-06-10 11:35:48.872991] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:05.032 [2024-06-10 11:35:48.954728] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:25:05.291 11:35:49 accel.accel_fill -- 
accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:05.291 11:35:49 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" 
in 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:25:06.228 11:35:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:25:06.487 11:35:50 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:06.487 11:35:50 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:25:06.487 11:35:50 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:06.487 00:25:06.487 real 0m1.421s 00:25:06.487 user 0m1.261s 00:25:06.487 sys 0m0.166s 00:25:06.487 11:35:50 accel.accel_fill -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:06.487 11:35:50 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:25:06.487 ************************************ 00:25:06.487 END TEST accel_fill 00:25:06.487 ************************************ 00:25:06.487 11:35:50 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:25:06.487 11:35:50 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:06.487 11:35:50 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:06.487 11:35:50 accel -- common/autotest_common.sh@10 -- # set +x 00:25:06.487 ************************************ 00:25:06.487 START TEST accel_copy_crc32c 00:25:06.487 ************************************ 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:25:06.487 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:25:06.487 [2024-06-10 11:35:50.286362] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:06.487 [2024-06-10 11:35:50.286414] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106177 ] 00:25:06.487 [2024-06-10 11:35:50.374308] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:06.746 [2024-06-10 11:35:50.455808] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.746 11:35:50 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.746 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # 
val=software 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:06.747 11:35:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 
-- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:08.134 00:25:08.134 real 0m1.414s 00:25:08.134 user 0m1.261s 00:25:08.134 sys 0m0.154s 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:08.134 11:35:51 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:25:08.134 ************************************ 00:25:08.134 END TEST accel_copy_crc32c 00:25:08.134 ************************************ 00:25:08.134 11:35:51 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:25:08.134 11:35:51 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:25:08.134 11:35:51 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:08.134 11:35:51 accel -- common/autotest_common.sh@10 -- # set +x 00:25:08.134 ************************************ 00:25:08.134 START TEST accel_copy_crc32c_C2 00:25:08.134 ************************************ 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test 
-t 1 -w copy_crc32c -y -C 2 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:25:08.134 11:35:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:25:08.134 [2024-06-10 11:35:51.772713] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:08.134 [2024-06-10 11:35:51.772770] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106423 ] 00:25:08.134 [2024-06-10 11:35:51.858943] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:08.134 [2024-06-10 11:35:51.941669] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.134 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:25:08.135 
11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:08.135 11:35:52 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:09.512 00:25:09.512 real 0m1.430s 00:25:09.512 user 0m1.264s 00:25:09.512 sys 0m0.163s 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:09.512 11:35:53 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:25:09.512 ************************************ 00:25:09.512 END TEST accel_copy_crc32c_C2 00:25:09.512 ************************************ 00:25:09.512 11:35:53 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:25:09.512 11:35:53 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:09.512 11:35:53 accel -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:25:09.512 11:35:53 accel -- common/autotest_common.sh@10 -- # set +x 00:25:09.512 ************************************ 00:25:09.512 START TEST accel_dualcast 00:25:09.512 ************************************ 00:25:09.512 11:35:53 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dualcast -y 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:25:09.512 11:35:53 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:25:09.512 [2024-06-10 11:35:53.261359] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:09.512 [2024-06-10 11:35:53.261402] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106692 ] 00:25:09.512 [2024-06-10 11:35:53.348684] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:09.512 [2024-06-10 11:35:53.431313] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 
00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:25:09.772 11:35:53 
accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:09.772 11:35:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:10.710 
11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:25:10.710 11:35:54 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:25:10.710 00:25:10.710 real 0m1.408s 00:25:10.710 user 0m1.254s 00:25:10.710 sys 0m0.159s 00:25:10.710 11:35:54 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:10.710 11:35:54 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:25:10.710 ************************************ 00:25:10.710 END TEST accel_dualcast 00:25:10.710 ************************************ 00:25:10.969 11:35:54 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:25:10.969 11:35:54 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:10.969 11:35:54 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:10.969 11:35:54 accel -- common/autotest_common.sh@10 -- # set +x 00:25:10.969 ************************************ 00:25:10.969 START TEST accel_compare 00:25:10.969 ************************************ 00:25:10.969 11:35:54 accel.accel_compare -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compare -y 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:10.969 11:35:54 
accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:25:10.969 11:35:54 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:25:10.969 [2024-06-10 11:35:54.737652] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:10.969 [2024-06-10 11:35:54.737694] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106925 ] 00:25:10.969 [2024-06-10 11:35:54.817890] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:10.969 [2024-06-10 11:35:54.899357] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" 
in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- 
# val=32 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.228 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 
00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:11.229 11:35:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:12.165 11:35:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:12.165 11:35:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:12.165 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:12.165 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:12.165 11:35:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:12.424 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:25:12.425 11:35:56 accel.accel_compare 
-- accel/accel.sh@19 -- # read -r var val 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:25:12.425 11:35:56 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:12.425 00:25:12.425 real 0m1.400s 00:25:12.425 user 0m1.246s 00:25:12.425 sys 0m0.160s 00:25:12.425 11:35:56 accel.accel_compare -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:12.425 11:35:56 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:25:12.425 ************************************ 00:25:12.425 END TEST accel_compare 00:25:12.425 ************************************ 00:25:12.425 11:35:56 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:25:12.425 11:35:56 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:12.425 11:35:56 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:12.425 11:35:56 accel -- common/autotest_common.sh@10 -- # set +x 00:25:12.425 ************************************ 00:25:12.425 START TEST accel_xor 00:25:12.425 ************************************ 00:25:12.425 11:35:56 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:25:12.425 11:35:56 accel.accel_xor -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:25:12.425 11:35:56 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:25:12.425 [2024-06-10 11:35:56.210132] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:12.425 [2024-06-10 11:35:56.210179] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107116 ] 00:25:12.425 [2024-06-10 11:35:56.296637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:12.692 [2024-06-10 11:35:56.385207] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:25:12.692 11:35:56 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:12.692 11:35:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:13.710 11:35:57 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:25:13.710 11:35:57 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:13.710 00:25:13.710 real 0m1.410s 00:25:13.710 user 0m1.263s 00:25:13.710 sys 0m0.148s 00:25:13.710 11:35:57 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:13.710 11:35:57 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:25:13.710 ************************************ 00:25:13.710 END TEST accel_xor 00:25:13.710 ************************************ 00:25:13.710 11:35:57 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:25:13.710 11:35:57 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:25:13.710 11:35:57 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:13.710 11:35:57 accel -- common/autotest_common.sh@10 -- # set +x 00:25:13.970 ************************************ 00:25:13.970 START TEST accel_xor 00:25:13.970 ************************************ 00:25:13.970 11:35:57 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y -x 3 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w xor -y -x 3 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:25:13.970 11:35:57 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:25:13.970 [2024-06-10 11:35:57.701083] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:13.970 [2024-06-10 11:35:57.701127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107325 ] 00:25:13.970 [2024-06-10 11:35:57.787288] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:13.970 [2024-06-10 11:35:57.866388] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case 
"$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor 
-- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:25:14.229 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.230 11:35:57 accel.accel_xor -- 
accel/accel.sh@20 -- # val= 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:14.230 11:35:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:15.167 11:35:59 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:25:15.167 11:35:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:15.167 00:25:15.167 real 0m1.393s 00:25:15.167 user 0m1.240s 00:25:15.167 sys 0m0.156s 00:25:15.168 11:35:59 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:15.168 11:35:59 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:25:15.168 ************************************ 00:25:15.168 END TEST accel_xor 00:25:15.168 ************************************ 00:25:15.168 11:35:59 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:25:15.168 11:35:59 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:25:15.168 11:35:59 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:15.168 11:35:59 accel -- common/autotest_common.sh@10 -- # set +x 00:25:15.427 ************************************ 00:25:15.427 START TEST accel_dif_verify 00:25:15.427 ************************************ 00:25:15.427 11:35:59 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_verify 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.427 11:35:59 
accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:25:15.427 11:35:59 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:25:15.427 [2024-06-10 11:35:59.163594] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:15.427 [2024-06-10 11:35:59.163636] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107517 ] 00:25:15.427 [2024-06-10 11:35:59.250247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:15.427 [2024-06-10 11:35:59.332502] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify 
-- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.687 11:35:59 
accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.687 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.688 11:35:59 
accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:15.688 11:35:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val= 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:25:16.625 11:36:00 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:16.625 00:25:16.625 real 0m1.407s 00:25:16.625 user 0m1.260s 00:25:16.625 sys 0m0.151s 00:25:16.625 11:36:00 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:16.625 11:36:00 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:25:16.625 ************************************ 00:25:16.625 END TEST accel_dif_verify 00:25:16.625 ************************************ 00:25:16.884 11:36:00 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:25:16.884 11:36:00 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:25:16.884 11:36:00 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:16.884 11:36:00 accel -- common/autotest_common.sh@10 -- # set +x 
00:25:16.884 ************************************ 00:25:16.884 START TEST accel_dif_generate 00:25:16.884 ************************************ 00:25:16.884 11:36:00 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:16.884 11:36:00 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:16.885 11:36:00 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:25:16.885 11:36:00 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:25:16.885 [2024-06-10 11:36:00.653247] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:16.885 [2024-06-10 11:36:00.653310] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid107760 ] 00:25:16.885 [2024-06-10 11:36:00.742945] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:17.144 [2024-06-10 11:36:00.833883] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 
00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 
00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.144 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.145 11:36:00 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:17.145 11:36:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:18.521 
11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:25:18.521 11:36:02 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:18.521 00:25:18.521 real 0m1.435s 00:25:18.521 user 0m1.283s 00:25:18.521 sys 0m0.156s 00:25:18.521 11:36:02 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:18.521 11:36:02 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:25:18.521 ************************************ 00:25:18.521 END TEST accel_dif_generate 00:25:18.521 ************************************ 00:25:18.521 11:36:02 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w 
dif_generate_copy 00:25:18.521 11:36:02 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:25:18.521 11:36:02 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:18.521 11:36:02 accel -- common/autotest_common.sh@10 -- # set +x 00:25:18.521 ************************************ 00:25:18.521 START TEST accel_dif_generate_copy 00:25:18.521 ************************************ 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate_copy 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:18.521 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 
00:25:18.522 [2024-06-10 11:36:02.164802] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:18.522 [2024-06-10 11:36:02.164857] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid108036 ] 00:25:18.522 [2024-06-10 11:36:02.251486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:18.522 [2024-06-10 11:36:02.334492] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- 
accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 
accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:25:18.522 11:36:02 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:18.522 11:36:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 
-- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:19.901 00:25:19.901 real 0m1.432s 00:25:19.901 user 0m1.265s 00:25:19.901 sys 0m0.164s 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:19.901 11:36:03 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:25:19.901 ************************************ 00:25:19.901 END TEST accel_dif_generate_copy 00:25:19.901 ************************************ 00:25:19.901 11:36:03 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:25:19.901 11:36:03 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w 
compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:19.901 11:36:03 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:25:19.901 11:36:03 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:19.901 11:36:03 accel -- common/autotest_common.sh@10 -- # set +x 00:25:19.901 ************************************ 00:25:19.901 START TEST accel_comp 00:25:19.901 ************************************ 00:25:19.901 11:36:03 accel.accel_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:25:19.901 11:36:03 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 
00:25:19.901 [2024-06-10 11:36:03.659372] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:19.901 [2024-06-10 11:36:03.659420] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid108239 ] 00:25:19.901 [2024-06-10 11:36:03.740829] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:19.901 [2024-06-10 11:36:03.821524] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:20.161 11:36:03 
accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp 
-- accel/accel.sh@19 -- # read -r var val 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:20.161 11:36:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@21 -- # case 
"$var" in 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:25:21.098 11:36:05 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:21.098 00:25:21.098 real 0m1.403s 00:25:21.098 user 0m1.252s 00:25:21.098 sys 0m0.141s 00:25:21.098 11:36:05 accel.accel_comp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:21.098 11:36:05 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:25:21.098 ************************************ 00:25:21.098 END TEST accel_comp 00:25:21.098 ************************************ 00:25:21.357 11:36:05 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:21.357 11:36:05 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:25:21.357 11:36:05 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:21.357 11:36:05 accel -- common/autotest_common.sh@10 -- # set +x 00:25:21.357 ************************************ 00:25:21.357 START TEST accel_decomp 00:25:21.357 ************************************ 00:25:21.357 11:36:05 accel.accel_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.357 11:36:05 
accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:25:21.357 11:36:05 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:25:21.357 [2024-06-10 11:36:05.131170] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:21.357 [2024-06-10 11:36:05.131222] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid108705 ] 00:25:21.357 [2024-06-10 11:36:05.217281] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:21.357 [2024-06-10 11:36:05.298319] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- 
accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # 
case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 
accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:21.617 11:36:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:22.996 11:36:06 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:22.996 00:25:22.996 real 0m1.414s 00:25:22.996 user 0m1.261s 00:25:22.996 sys 0m0.145s 00:25:22.996 11:36:06 accel.accel_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:22.996 11:36:06 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:25:22.996 ************************************ 00:25:22.996 END TEST accel_decomp 00:25:22.996 ************************************ 00:25:22.996 11:36:06 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:22.996 11:36:06 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:25:22.996 11:36:06 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:22.996 11:36:06 accel -- common/autotest_common.sh@10 -- # set +x 00:25:22.996 ************************************ 00:25:22.996 START TEST accel_decomp_full 00:25:22.996 ************************************ 00:25:22.996 11:36:06 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:25:22.996 11:36:06 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:25:22.996 [2024-06-10 11:36:06.622214] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:22.996 [2024-06-10 11:36:06.622263] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109093 ] 00:25:22.996 [2024-06-10 11:36:06.715239] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:22.997 [2024-06-10 11:36:06.798314] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 
accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # 
read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:22.997 11:36:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 
11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:24.375 11:36:08 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:24.375 00:25:24.375 real 0m1.423s 00:25:24.375 user 0m1.275s 00:25:24.375 sys 0m0.156s 00:25:24.375 11:36:08 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:24.375 11:36:08 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:25:24.375 ************************************ 00:25:24.375 END TEST accel_decomp_full 00:25:24.375 ************************************ 00:25:24.375 11:36:08 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:24.375 11:36:08 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:25:24.375 11:36:08 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:24.375 11:36:08 accel -- common/autotest_common.sh@10 -- # set +x 00:25:24.375 ************************************ 00:25:24.375 START TEST accel_decomp_mcore 00:25:24.375 
************************************ 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:25:24.375 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:25:24.375 [2024-06-10 11:36:08.136111] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:24.375 [2024-06-10 11:36:08.136168] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109363 ] 00:25:24.375 [2024-06-10 11:36:08.221893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:25:24.375 [2024-06-10 11:36:08.306738] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:24.375 [2024-06-10 11:36:08.306760] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:24.375 [2024-06-10 11:36:08.306836] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:25:24.375 [2024-06-10 11:36:08.306838] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 
-- # val=software 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.634 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:24.635 11:36:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 
11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:26.012 00:25:26.012 real 0m1.429s 00:25:26.012 user 0m4.665s 00:25:26.012 sys 0m0.170s 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:26.012 11:36:09 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:25:26.012 ************************************ 00:25:26.012 END TEST accel_decomp_mcore 00:25:26.012 ************************************ 00:25:26.012 11:36:09 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:26.012 11:36:09 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:25:26.012 11:36:09 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:26.012 11:36:09 accel -- common/autotest_common.sh@10 -- # set +x 00:25:26.012 ************************************ 00:25:26.012 START TEST accel_decomp_full_mcore 00:25:26.012 ************************************ 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:25:26.012 [2024-06-10 11:36:09.638412] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:26.012 [2024-06-10 11:36:09.638468] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109577 ] 00:25:26.012 [2024-06-10 11:36:09.725158] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:25:26.012 [2024-06-10 11:36:09.810129] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:26.012 [2024-06-10 11:36:09.810216] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:26.012 [2024-06-10 11:36:09.810292] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:25:26.012 [2024-06-10 11:36:09.810294] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:25:26.012 11:36:09 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.012 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:26.013 11:36:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:27.388 00:25:27.388 real 0m1.439s 00:25:27.388 user 0m4.696s 00:25:27.388 sys 0m0.174s 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:27.388 11:36:11 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:25:27.388 ************************************ 00:25:27.388 END TEST accel_decomp_full_mcore 00:25:27.388 ************************************ 00:25:27.388 11:36:11 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:27.388 11:36:11 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:25:27.388 11:36:11 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:27.388 11:36:11 accel -- common/autotest_common.sh@10 -- # set +x 00:25:27.388 ************************************ 00:25:27.388 START TEST accel_decomp_mthread 
00:25:27.388 ************************************ 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:25:27.388 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:25:27.388 [2024-06-10 11:36:11.151905] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:27.388 [2024-06-10 11:36:11.151962] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109775 ] 00:25:27.388 [2024-06-10 11:36:11.235760] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:27.388 [2024-06-10 11:36:11.316600] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:25:27.648 11:36:11 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- 
accel/accel.sh@20 -- # val=Yes 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.648 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:27.649 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:27.649 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:27.649 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:27.649 11:36:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.584 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:28.844 00:25:28.844 real 0m1.416s 00:25:28.844 user 0m1.255s 00:25:28.844 sys 0m0.167s 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:28.844 11:36:12 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:25:28.844 
************************************ 00:25:28.844 END TEST accel_decomp_mthread 00:25:28.844 ************************************ 00:25:28.844 11:36:12 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:28.844 11:36:12 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:25:28.844 11:36:12 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:28.844 11:36:12 accel -- common/autotest_common.sh@10 -- # set +x 00:25:28.844 ************************************ 00:25:28.844 START TEST accel_decomp_full_mthread 00:25:28.844 ************************************ 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:28.844 11:36:12 
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:25:28.844 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:25:28.844 [2024-06-10 11:36:12.632713] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:28.844 [2024-06-10 11:36:12.632758] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid109977 ] 00:25:28.844 [2024-06-10 11:36:12.713692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.104 [2024-06-10 11:36:12.797619] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread 
-- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 
11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 
accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:29.104 11:36:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 
11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 
accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:25:30.483 00:25:30.483 real 0m1.430s 00:25:30.483 user 0m1.281s 00:25:30.483 sys 0m0.155s 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:30.483 11:36:14 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:25:30.483 ************************************ 00:25:30.483 END TEST accel_decomp_full_mthread 00:25:30.483 ************************************ 00:25:30.483 11:36:14 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:25:30.483 11:36:14 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:25:30.483 11:36:14 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:25:30.483 11:36:14 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:25:30.483 11:36:14 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=110166 00:25:30.483 11:36:14 accel -- accel/accel.sh@63 -- # waitforlisten 110166 00:25:30.483 11:36:14 accel -- common/autotest_common.sh@830 -- # '[' -z 110166 ']' 00:25:30.483 11:36:14 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:30.483 11:36:14 accel -- common/autotest_common.sh@835 -- # local 
max_retries=100 00:25:30.483 11:36:14 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:25:30.483 11:36:14 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:30.483 11:36:14 accel -- accel/accel.sh@61 -- # build_accel_config 00:25:30.483 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:30.483 11:36:14 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:30.483 11:36:14 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:30.483 11:36:14 accel -- common/autotest_common.sh@10 -- # set +x 00:25:30.483 11:36:14 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:30.483 11:36:14 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:30.483 11:36:14 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:30.483 11:36:14 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:30.483 11:36:14 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:30.483 11:36:14 accel -- accel/accel.sh@40 -- # local IFS=, 00:25:30.483 11:36:14 accel -- accel/accel.sh@41 -- # jq -r . 00:25:30.483 [2024-06-10 11:36:14.137025] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:30.483 [2024-06-10 11:36:14.137083] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid110166 ] 00:25:30.483 [2024-06-10 11:36:14.222595] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:30.483 [2024-06-10 11:36:14.309905] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:31.051 [2024-06-10 11:36:14.852431] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@863 -- # return 0 00:25:31.310 11:36:15 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:25:31.310 11:36:15 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:25:31.310 11:36:15 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:25:31.310 11:36:15 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:25:31.310 11:36:15 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:25:31.310 11:36:15 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@10 -- # set +x 00:25:31.310 11:36:15 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:25:31.310 11:36:15 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:31.310 "method": "compressdev_scan_accel_module", 00:25:31.310 11:36:15 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:25:31.310 11:36:15 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@10 -- # set +x 00:25:31.310 11:36:15 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:25:31.310 11:36:15 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:31.310 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.310 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.310 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.310 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.310 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.310 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.310 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.310 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.310 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.310 11:36:15 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:25:31.310 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.310 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- 
accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # IFS== 00:25:31.311 11:36:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:25:31.311 11:36:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:25:31.311 11:36:15 accel -- accel/accel.sh@75 -- # killprocess 110166 00:25:31.311 11:36:15 accel -- common/autotest_common.sh@949 -- # '[' -z 110166 ']' 00:25:31.311 11:36:15 accel -- common/autotest_common.sh@953 -- # kill -0 110166 00:25:31.311 11:36:15 accel -- common/autotest_common.sh@954 -- # uname 00:25:31.311 11:36:15 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:31.311 11:36:15 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 110166 00:25:31.570 11:36:15 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:31.570 11:36:15 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:31.570 11:36:15 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 110166' 00:25:31.570 killing process with pid 110166 00:25:31.570 11:36:15 accel -- common/autotest_common.sh@968 -- # kill 
110166 00:25:31.570 11:36:15 accel -- common/autotest_common.sh@973 -- # wait 110166 00:25:31.829 11:36:15 accel -- accel/accel.sh@76 -- # trap - ERR 00:25:31.829 11:36:15 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:31.829 11:36:15 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:25:31.829 11:36:15 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:31.829 11:36:15 accel -- common/autotest_common.sh@10 -- # set +x 00:25:31.829 ************************************ 00:25:31.829 START TEST accel_cdev_comp 00:25:31.829 ************************************ 00:25:31.829 11:36:15 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:31.829 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:31.830 11:36:15 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:25:31.830 11:36:15 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:25:31.830 [2024-06-10 11:36:15.720355] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:31.830 [2024-06-10 11:36:15.720414] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid110366 ] 00:25:32.088 [2024-06-10 11:36:15.808700] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:32.088 [2024-06-10 11:36:15.895400] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:32.657 [2024-06-10 11:36:16.444502] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:32.657 [2024-06-10 11:36:16.446419] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f7dd10 PMD being used: compress_qat 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 [2024-06-10 11:36:16.449918] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f82a90 PMD being used: compress_qat 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp 
-- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.657 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.658 11:36:16 accel.accel_cdev_comp 
-- accel/accel.sh@20 -- # val=1 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:32.658 11:36:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:34.036 11:36:17 
accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:25:34.036 11:36:17 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:34.036 00:25:34.036 real 0m1.926s 00:25:34.036 user 0m1.481s 00:25:34.036 sys 0m0.443s 00:25:34.036 11:36:17 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- 
# xtrace_disable 00:25:34.036 11:36:17 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:25:34.036 ************************************ 00:25:34.036 END TEST accel_cdev_comp 00:25:34.036 ************************************ 00:25:34.036 11:36:17 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:34.036 11:36:17 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:25:34.036 11:36:17 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:34.036 11:36:17 accel -- common/autotest_common.sh@10 -- # set +x 00:25:34.036 ************************************ 00:25:34.036 START TEST accel_cdev_decomp 00:25:34.036 ************************************ 00:25:34.036 11:36:17 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:34.036 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:25:34.036 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:25:34.036 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.036 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.036 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@32 
-- # [[ 0 -gt 0 ]] 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:25:34.037 11:36:17 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:25:34.037 [2024-06-10 11:36:17.720337] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:34.037 [2024-06-10 11:36:17.720395] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid110727 ] 00:25:34.037 [2024-06-10 11:36:17.805966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:34.037 [2024-06-10 11:36:17.886973] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:34.605 [2024-06-10 11:36:18.409534] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:34.605 [2024-06-10 11:36:18.411339] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fd1d10 PMD being used: compress_qat 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 [2024-06-10 11:36:18.414768] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21d6af0 PMD being used: compress_qat 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- 
accel/accel.sh@20 -- # val= 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:25:34.605 11:36:18 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.605 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:34.606 11:36:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:35.985 11:36:19 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:35.985 
11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:35.985 00:25:35.985 real 0m1.889s 00:25:35.985 user 0m1.469s 00:25:35.985 sys 0m0.426s 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:35.985 11:36:19 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:25:35.985 ************************************ 00:25:35.985 END TEST accel_cdev_decomp 00:25:35.985 ************************************ 00:25:35.985 11:36:19 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:35.985 11:36:19 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:25:35.985 11:36:19 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:35.985 11:36:19 accel -- common/autotest_common.sh@10 -- # set +x 00:25:35.985 ************************************ 00:25:35.985 START TEST accel_cdev_decomp_full 00:25:35.985 ************************************ 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full 
-- accel/accel.sh@12 -- # build_accel_config 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:25:35.985 11:36:19 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:25:35.985 [2024-06-10 11:36:19.674603] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:35.985 [2024-06-10 11:36:19.674657] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid110928 ] 00:25:35.985 [2024-06-10 11:36:19.759156] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:35.985 [2024-06-10 11:36:19.839094] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:36.552 [2024-06-10 11:36:20.395277] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:36.552 [2024-06-10 11:36:20.397228] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x142cd10 PMD being used: compress_qat 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 [2024-06-10 11:36:20.399953] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1430070 PMD being used: compress_qat 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- 
accel/accel.sh@20 -- # val=0x1 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:36.552 11:36:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:37.930 11:36:21 
accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:37.930 
00:25:37.930 real 0m1.925s 00:25:37.930 user 0m1.479s 00:25:37.930 sys 0m0.446s 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:37.930 11:36:21 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:25:37.930 ************************************ 00:25:37.930 END TEST accel_cdev_decomp_full 00:25:37.930 ************************************ 00:25:37.930 11:36:21 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:37.930 11:36:21 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:25:37.930 11:36:21 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:37.930 11:36:21 accel -- common/autotest_common.sh@10 -- # set +x 00:25:37.930 ************************************ 00:25:37.930 START TEST accel_cdev_decomp_mcore 00:25:37.930 ************************************ 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:25:37.930 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:25:37.931 11:36:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:25:37.931 [2024-06-10 11:36:21.651799] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:37.931 [2024-06-10 11:36:21.651842] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid111288 ] 00:25:37.931 [2024-06-10 11:36:21.733194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:25:37.931 [2024-06-10 11:36:21.818893] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:37.931 [2024-06-10 11:36:21.818963] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:25:37.931 [2024-06-10 11:36:21.818965] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:37.931 [2024-06-10 11:36:21.818938] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:38.499 [2024-06-10 11:36:22.373957] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:38.499 [2024-06-10 11:36:22.375941] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c1d3b0 PMD being used: compress_qat 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.499 [2024-06-10 11:36:22.380739] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2ec819b8b0 PMD being used: 
compress_qat 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:25:38.499 [2024-06-10 11:36:22.381885] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2eb819b8b0 PMD being used: compress_qat 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 [2024-06-10 11:36:22.382313] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c228a0 PMD being used: compress_qat 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 [2024-06-10 11:36:22.382427] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2ec019b8b0 PMD being used: compress_qat 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:25:38.499 11:36:22 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.499 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:38.500 11:36:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 
11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:39.878 00:25:39.878 real 0m1.918s 00:25:39.878 user 0m6.395s 00:25:39.878 
sys 0m0.444s 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:39.878 11:36:23 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:25:39.878 ************************************ 00:25:39.878 END TEST accel_cdev_decomp_mcore 00:25:39.878 ************************************ 00:25:39.878 11:36:23 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:39.878 11:36:23 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:25:39.878 11:36:23 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:39.878 11:36:23 accel -- common/autotest_common.sh@10 -- # set +x 00:25:39.878 ************************************ 00:25:39.878 START TEST accel_cdev_decomp_full_mcore 00:25:39.878 ************************************ 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:25:39.878 11:36:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:25:39.878 [2024-06-10 11:36:23.643877] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:39.878 [2024-06-10 11:36:23.643937] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid111496 ] 00:25:39.878 [2024-06-10 11:36:23.727805] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:25:39.878 [2024-06-10 11:36:23.811760] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:39.878 [2024-06-10 11:36:23.811848] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:39.878 [2024-06-10 11:36:23.811924] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:25:39.878 [2024-06-10 11:36:23.811926] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:40.450 [2024-06-10 11:36:24.380282] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:40.450 [2024-06-10 11:36:24.382272] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c623b0 PMD being used: compress_qat 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.450 [2024-06-10 11:36:24.386227] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x7f53cc19b8b0 PMD being used: compress_qat 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:25:40.450 [2024-06-10 11:36:24.387406] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f53c419b8b0 PMD being used: compress_qat 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 [2024-06-10 11:36:24.387835] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c62450 PMD being used: compress_qat 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 [2024-06-10 11:36:24.387951] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f53bc19b8b0 PMD being used: compress_qat 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 
00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.450 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 
00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:40.791 11:36:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:25:41.743 11:36:25 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:41.743 00:25:41.743 real 0m1.954s 00:25:41.743 user 0m6.493s 00:25:41.743 sys 0m0.446s 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:41.743 11:36:25 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:25:41.743 ************************************ 00:25:41.743 END TEST accel_cdev_decomp_full_mcore 00:25:41.743 ************************************ 00:25:41.743 11:36:25 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:41.743 11:36:25 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:25:41.743 11:36:25 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:41.743 11:36:25 accel -- common/autotest_common.sh@10 -- # set +x 00:25:41.743 ************************************ 00:25:41.743 START TEST accel_cdev_decomp_mthread 00:25:41.743 ************************************ 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:25:41.743 11:36:25 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:25:41.743 11:36:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:25:41.743 [2024-06-10 11:36:25.664107] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
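Each test's startup banner in this log prints a `[ DPDK EAL parameters: ... ]` line whose `--file-prefix=spdk_pidNNNNNN` token ties the run to a process id. A hedged sketch of recovering that pid from such a line with plain parameter expansion (the sample line is abbreviated from this log; no external tools needed):

```shell
#!/usr/bin/env bash
# Sketch: extract the spdk pid from a "[ DPDK EAL parameters: ... ]"
# banner line using bash prefix/suffix stripping.
line='[ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --file-prefix=spdk_pid111861 ]'
pid=${line##*--file-prefix=spdk_pid}   # drop everything up to the pid digits
pid=${pid%% *}                         # drop the trailing " ]"
echo "$pid"
```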
00:25:41.743 [2024-06-10 11:36:25.664151] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid111861 ] 00:25:42.003 [2024-06-10 11:36:25.750905] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:42.003 [2024-06-10 11:36:25.829073] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:42.572 [2024-06-10 11:36:26.358795] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:42.572 [2024-06-10 11:36:26.360698] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x136cd10 PMD being used: compress_qat 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 [2024-06-10 11:36:26.364975] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1371e40 PMD being used: compress_qat 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 
11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:25:42.572 [2024-06-10 11:36:26.366815] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1494cf0 PMD being used: compress_qat 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 
11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.572 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:42.573 11:36:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 
00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:43.952 00:25:43.952 real 0m1.891s 00:25:43.952 user 0m1.445s 00:25:43.952 sys 0m0.447s 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:43.952 11:36:27 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:25:43.952 ************************************ 00:25:43.952 END TEST accel_cdev_decomp_mthread 00:25:43.952 ************************************ 00:25:43.952 11:36:27 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:43.952 11:36:27 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:25:43.952 11:36:27 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:43.952 11:36:27 accel -- common/autotest_common.sh@10 -- # set +x 00:25:43.952 ************************************ 00:25:43.952 START TEST accel_cdev_decomp_full_mthread 00:25:43.952 ************************************ 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:43.952 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:25:43.953 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:25:43.953 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:25:43.953 11:36:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
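The accel.sh@19 through @21 lines repeated throughout the trace (`IFS=:`, `read -r var val`, `case "$var" in`) are one loop: the harness consumes colon-separated `flag:value` pairs, splitting each line on the first `:` and dispatching on the flag. A simplified sketch of that loop under assumed input; the `-w`/`-q` keys are illustrative, not the script's full option set:

```shell
#!/usr/bin/env bash
# Sketch of the accel.sh value-reading loop: each input line is split on
# the first ':' into $var and $val, then dispatched via a case statement.
accel_opc="" queue_depth=""
while IFS=: read -r var val; do
    case "$var" in
        -w) accel_opc=$val ;;    # workload name, e.g. decompress
        -q) queue_depth=$val ;;  # illustrative key, not taken from the log
        *) : ;;                  # other flags ignored in this sketch
    esac
done <<'EOF'
-w:decompress
-q:32
EOF
printf 'opc=%s depth=%s\n' "$accel_opc" "$queue_depth"
```

The heredoc redirect (rather than a pipe) keeps the `while` loop in the current shell, so `accel_opc` and `queue_depth` survive after the loop, the same reason accel.sh can use the parsed values later in the function.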
00:25:43.953 [2024-06-10 11:36:27.643752] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:43.953 [2024-06-10 11:36:27.643810] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid112065 ] 00:25:43.953 [2024-06-10 11:36:27.731967] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:43.953 [2024-06-10 11:36:27.816968] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:44.533 [2024-06-10 11:36:28.340844] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:44.533 [2024-06-10 11:36:28.342774] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1991d10 PMD being used: compress_qat 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 [2024-06-10 11:36:28.346155] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1991db0 PMD being used: compress_qat 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:25:44.533 [2024-06-10 11:36:28.347984] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b96910 PMD being used: compress_qat 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.533 11:36:28 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:25:44.534 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:44.534 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:44.534 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:44.534 11:36:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:25:45.914 00:25:45.914 real 0m1.889s 00:25:45.914 user 0m1.470s 00:25:45.914 sys 0m0.428s 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:45.914 11:36:29 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:25:45.914 ************************************ 00:25:45.914 END TEST accel_cdev_decomp_full_mthread 00:25:45.914 ************************************ 00:25:45.914 11:36:29 accel -- accel/accel.sh@134 -- 
# unset COMPRESSDEV 00:25:45.914 11:36:29 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:25:45.914 11:36:29 accel -- accel/accel.sh@137 -- # build_accel_config 00:25:45.914 11:36:29 accel -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:25:45.914 11:36:29 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:45.914 11:36:29 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:25:45.914 11:36:29 accel -- common/autotest_common.sh@10 -- # set +x 00:25:45.914 11:36:29 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:25:45.914 11:36:29 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:25:45.914 11:36:29 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:25:45.914 11:36:29 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:25:45.914 11:36:29 accel -- accel/accel.sh@40 -- # local IFS=, 00:25:45.914 11:36:29 accel -- accel/accel.sh@41 -- # jq -r . 00:25:45.914 ************************************ 00:25:45.914 START TEST accel_dif_functional_tests 00:25:45.914 ************************************ 00:25:45.914 11:36:29 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:25:45.914 [2024-06-10 11:36:29.623504] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:45.914 [2024-06-10 11:36:29.623550] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid112378 ] 00:25:45.914 [2024-06-10 11:36:29.708069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:45.914 [2024-06-10 11:36:29.792164] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:45.914 [2024-06-10 11:36:29.792182] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:45.914 [2024-06-10 11:36:29.792184] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:46.174 00:25:46.174 00:25:46.174 CUnit - A unit testing framework for C - Version 2.1-3 00:25:46.174 http://cunit.sourceforge.net/ 00:25:46.174 00:25:46.174 00:25:46.174 Suite: accel_dif 00:25:46.174 Test: verify: DIF generated, GUARD check ...passed 00:25:46.174 Test: verify: DIF generated, APPTAG check ...passed 00:25:46.174 Test: verify: DIF generated, REFTAG check ...passed 00:25:46.174 Test: verify: DIF not generated, GUARD check ...[2024-06-10 11:36:29.887986] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:25:46.174 passed 00:25:46.174 Test: verify: DIF not generated, APPTAG check ...[2024-06-10 11:36:29.888039] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:25:46.174 passed 00:25:46.174 Test: verify: DIF not generated, REFTAG check ...[2024-06-10 11:36:29.888063] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:25:46.174 passed 00:25:46.174 Test: verify: APPTAG correct, APPTAG check ...passed 00:25:46.174 Test: verify: APPTAG incorrect, APPTAG check ...[2024-06-10 11:36:29.888110] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:25:46.174 passed 00:25:46.174 Test: verify: APPTAG 
incorrect, no APPTAG check ...passed 00:25:46.174 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:25:46.174 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:25:46.174 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-06-10 11:36:29.888215] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:25:46.174 passed 00:25:46.174 Test: verify copy: DIF generated, GUARD check ...passed 00:25:46.174 Test: verify copy: DIF generated, APPTAG check ...passed 00:25:46.174 Test: verify copy: DIF generated, REFTAG check ...passed 00:25:46.174 Test: verify copy: DIF not generated, GUARD check ...[2024-06-10 11:36:29.888330] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:25:46.174 passed 00:25:46.174 Test: verify copy: DIF not generated, APPTAG check ...[2024-06-10 11:36:29.888357] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:25:46.174 passed 00:25:46.174 Test: verify copy: DIF not generated, REFTAG check ...[2024-06-10 11:36:29.888382] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:25:46.174 passed 00:25:46.174 Test: generate copy: DIF generated, GUARD check ...passed 00:25:46.174 Test: generate copy: DIF generated, APTTAG check ...passed 00:25:46.174 Test: generate copy: DIF generated, REFTAG check ...passed 00:25:46.174 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:25:46.174 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:25:46.174 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:25:46.174 Test: generate copy: iovecs-len validate ...[2024-06-10 11:36:29.888561] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:25:46.174 passed 00:25:46.174 Test: generate copy: buffer alignment validate ...passed 00:25:46.174 00:25:46.174 Run Summary: Type Total Ran Passed Failed Inactive 00:25:46.174 suites 1 1 n/a 0 0 00:25:46.174 tests 26 26 26 0 0 00:25:46.174 asserts 115 115 115 0 n/a 00:25:46.174 00:25:46.174 Elapsed time = 0.002 seconds 00:25:46.174 00:25:46.174 real 0m0.513s 00:25:46.174 user 0m0.737s 00:25:46.174 sys 0m0.194s 00:25:46.174 11:36:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:46.174 11:36:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:25:46.174 ************************************ 00:25:46.174 END TEST accel_dif_functional_tests 00:25:46.174 ************************************ 00:25:46.434 00:25:46.434 real 0m48.577s 00:25:46.434 user 0m57.478s 00:25:46.434 sys 0m9.611s 00:25:46.434 11:36:30 accel -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:46.434 11:36:30 accel -- common/autotest_common.sh@10 -- # set +x 00:25:46.434 ************************************ 00:25:46.434 END TEST accel 00:25:46.434 ************************************ 00:25:46.434 11:36:30 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:25:46.434 11:36:30 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:46.435 11:36:30 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:46.435 11:36:30 -- common/autotest_common.sh@10 -- # set +x 00:25:46.435 ************************************ 00:25:46.435 START TEST accel_rpc 00:25:46.435 ************************************ 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:25:46.435 * Looking for test storage... 
00:25:46.435 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:25:46.435 11:36:30 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:25:46.435 11:36:30 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=112498 00:25:46.435 11:36:30 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 112498 00:25:46.435 11:36:30 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@830 -- # '[' -z 112498 ']' 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:46.435 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:46.435 11:36:30 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:25:46.694 [2024-06-10 11:36:30.381189] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:46.694 [2024-06-10 11:36:30.381248] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid112498 ] 00:25:46.694 [2024-06-10 11:36:30.467179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:46.694 [2024-06-10 11:36:30.552255] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:47.263 11:36:31 accel_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:47.263 11:36:31 accel_rpc -- common/autotest_common.sh@863 -- # return 0 00:25:47.263 11:36:31 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:25:47.263 11:36:31 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:25:47.263 11:36:31 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:25:47.263 11:36:31 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:25:47.263 11:36:31 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:25:47.263 11:36:31 accel_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:47.263 11:36:31 accel_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:47.263 11:36:31 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:25:47.522 ************************************ 00:25:47.522 START TEST accel_assign_opcode 00:25:47.522 ************************************ 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # accel_assign_opcode_test_suite 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:25:47.522 [2024-06-10 11:36:31.226313] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module incorrect 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:25:47.522 [2024-06-10 11:36:31.234324] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:25:47.522 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:47.781 software 00:25:47.781 00:25:47.781 real 0m0.272s 00:25:47.781 user 0m0.046s 00:25:47.781 sys 0m0.017s 00:25:47.782 11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:47.782 
11:36:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:25:47.782 ************************************ 00:25:47.782 END TEST accel_assign_opcode 00:25:47.782 ************************************ 00:25:47.782 11:36:31 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 112498 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@949 -- # '[' -z 112498 ']' 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@953 -- # kill -0 112498 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@954 -- # uname 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 112498 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 112498' 00:25:47.782 killing process with pid 112498 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@968 -- # kill 112498 00:25:47.782 11:36:31 accel_rpc -- common/autotest_common.sh@973 -- # wait 112498 00:25:48.041 00:25:48.041 real 0m1.719s 00:25:48.041 user 0m1.697s 00:25:48.041 sys 0m0.555s 00:25:48.041 11:36:31 accel_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:48.041 11:36:31 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:25:48.041 ************************************ 00:25:48.041 END TEST accel_rpc 00:25:48.041 ************************************ 00:25:48.041 11:36:31 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:25:48.041 11:36:31 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:48.041 11:36:31 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:48.041 11:36:31 -- common/autotest_common.sh@10 -- # set 
+x 00:25:48.300 ************************************ 00:25:48.300 START TEST app_cmdline 00:25:48.300 ************************************ 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:25:48.300 * Looking for test storage... 00:25:48.300 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:25:48.300 11:36:32 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:25:48.300 11:36:32 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:25:48.300 11:36:32 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=112785 00:25:48.300 11:36:32 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 112785 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@830 -- # '[' -z 112785 ']' 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:48.300 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:48.300 11:36:32 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:25:48.300 [2024-06-10 11:36:32.149711] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:48.300 [2024-06-10 11:36:32.149769] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid112785 ] 00:25:48.300 [2024-06-10 11:36:32.233844] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.560 [2024-06-10 11:36:32.322905] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:49.128 11:36:32 app_cmdline -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:49.128 11:36:32 app_cmdline -- common/autotest_common.sh@863 -- # return 0 00:25:49.128 11:36:32 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:25:49.388 { 00:25:49.388 "version": "SPDK v24.09-pre git sha1 a3f6419f1", 00:25:49.388 "fields": { 00:25:49.388 "major": 24, 00:25:49.388 "minor": 9, 00:25:49.388 "patch": 0, 00:25:49.388 "suffix": "-pre", 00:25:49.388 "commit": "a3f6419f1" 00:25:49.388 } 00:25:49.388 } 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@26 -- # sort 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 
)) 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:25:49.388 11:36:33 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@649 -- # local es=0 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:49.388 11:36:33 app_cmdline -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:25:49.647 request: 00:25:49.647 { 00:25:49.647 "method": "env_dpdk_get_mem_stats", 00:25:49.647 "req_id": 1 00:25:49.647 } 00:25:49.647 Got JSON-RPC error response 00:25:49.647 response: 00:25:49.647 { 00:25:49.647 "code": -32601, 00:25:49.647 
"message": "Method not found" 00:25:49.647 } 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@652 -- # es=1 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:49.647 11:36:33 app_cmdline -- app/cmdline.sh@1 -- # killprocess 112785 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@949 -- # '[' -z 112785 ']' 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@953 -- # kill -0 112785 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@954 -- # uname 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 112785 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@967 -- # echo 'killing process with pid 112785' 00:25:49.647 killing process with pid 112785 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@968 -- # kill 112785 00:25:49.647 11:36:33 app_cmdline -- common/autotest_common.sh@973 -- # wait 112785 00:25:49.907 00:25:49.907 real 0m1.730s 00:25:49.907 user 0m1.943s 00:25:49.907 sys 0m0.524s 00:25:49.907 11:36:33 app_cmdline -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:49.907 11:36:33 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:25:49.907 ************************************ 00:25:49.907 END TEST app_cmdline 00:25:49.907 ************************************ 00:25:49.907 11:36:33 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:25:49.907 11:36:33 -- 
common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:49.907 11:36:33 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:49.907 11:36:33 -- common/autotest_common.sh@10 -- # set +x 00:25:49.907 ************************************ 00:25:49.907 START TEST version 00:25:49.907 ************************************ 00:25:49.907 11:36:33 version -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:25:50.166 * Looking for test storage... 00:25:50.166 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:25:50.166 11:36:33 version -- app/version.sh@17 -- # get_header_version major 00:25:50.166 11:36:33 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # cut -f2 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # tr -d '"' 00:25:50.166 11:36:33 version -- app/version.sh@17 -- # major=24 00:25:50.166 11:36:33 version -- app/version.sh@18 -- # get_header_version minor 00:25:50.166 11:36:33 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # cut -f2 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # tr -d '"' 00:25:50.166 11:36:33 version -- app/version.sh@18 -- # minor=9 00:25:50.166 11:36:33 version -- app/version.sh@19 -- # get_header_version patch 00:25:50.166 11:36:33 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # cut -f2 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # tr -d '"' 00:25:50.166 11:36:33 version -- app/version.sh@19 -- # patch=0 00:25:50.166 11:36:33 version -- 
app/version.sh@20 -- # get_header_version suffix 00:25:50.166 11:36:33 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # cut -f2 00:25:50.166 11:36:33 version -- app/version.sh@14 -- # tr -d '"' 00:25:50.166 11:36:33 version -- app/version.sh@20 -- # suffix=-pre 00:25:50.166 11:36:33 version -- app/version.sh@22 -- # version=24.9 00:25:50.166 11:36:33 version -- app/version.sh@25 -- # (( patch != 0 )) 00:25:50.166 11:36:33 version -- app/version.sh@28 -- # version=24.9rc0 00:25:50.167 11:36:33 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:50.167 11:36:33 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:25:50.167 11:36:33 version -- app/version.sh@30 -- # py_version=24.9rc0 00:25:50.167 11:36:33 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:25:50.167 00:25:50.167 real 0m0.178s 00:25:50.167 user 0m0.089s 00:25:50.167 sys 0m0.137s 00:25:50.167 11:36:33 version -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:50.167 11:36:33 version -- common/autotest_common.sh@10 -- # set +x 00:25:50.167 ************************************ 00:25:50.167 END TEST version 00:25:50.167 ************************************ 00:25:50.167 11:36:34 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:25:50.167 11:36:34 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:25:50.167 11:36:34 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:50.167 11:36:34 -- common/autotest_common.sh@1106 
-- # xtrace_disable 00:25:50.167 11:36:34 -- common/autotest_common.sh@10 -- # set +x 00:25:50.167 ************************************ 00:25:50.167 START TEST blockdev_general 00:25:50.167 ************************************ 00:25:50.167 11:36:34 blockdev_general -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:25:50.427 * Looking for test storage... 00:25:50.427 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:25:50.427 11:36:34 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:25:50.427 11:36:34 blockdev_general -- 
bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=113222 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 113222 00:25:50.427 11:36:34 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:25:50.427 11:36:34 blockdev_general -- common/autotest_common.sh@830 -- # '[' -z 113222 ']' 00:25:50.427 11:36:34 blockdev_general -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:50.427 11:36:34 blockdev_general -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:50.427 11:36:34 blockdev_general -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:50.427 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:25:50.427 11:36:34 blockdev_general -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:50.427 11:36:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:50.427 [2024-06-10 11:36:34.246625] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:50.427 [2024-06-10 11:36:34.246684] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid113222 ] 00:25:50.427 [2024-06-10 11:36:34.333788] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:50.686 [2024-06-10 11:36:34.412733] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:51.254 11:36:35 blockdev_general -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:51.254 11:36:35 blockdev_general -- common/autotest_common.sh@863 -- # return 0 00:25:51.254 11:36:35 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:25:51.254 11:36:35 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:25:51.254 11:36:35 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:25:51.254 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.254 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:51.513 [2024-06-10 11:36:35.277501] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:51.513 [2024-06-10 11:36:35.277549] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:51.513 00:25:51.513 [2024-06-10 11:36:35.285492] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:51.513 [2024-06-10 11:36:35.285510] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:51.513 00:25:51.513 Malloc0 00:25:51.513 Malloc1 00:25:51.513 Malloc2 00:25:51.513 Malloc3 
00:25:51.513 Malloc4 00:25:51.513 Malloc5 00:25:51.513 Malloc6 00:25:51.513 Malloc7 00:25:51.513 Malloc8 00:25:51.513 Malloc9 00:25:51.513 [2024-06-10 11:36:35.424134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:51.513 [2024-06-10 11:36:35.424177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:51.513 [2024-06-10 11:36:35.424194] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe3ad20 00:25:51.513 [2024-06-10 11:36:35.424202] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:51.513 [2024-06-10 11:36:35.425195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:51.513 [2024-06-10 11:36:35.425218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:25:51.513 TestPT 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:25:51.772 5000+0 records in 00:25:51.772 5000+0 records out 00:25:51.772 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0269555 s, 380 MB/s 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:51.772 AIO0 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 
00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:25:51.772 11:36:35 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:25:51.772 11:36:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:52.032 11:36:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:25:52.032 11:36:35 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:25:52.032 11:36:35 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 
00:25:52.033 11:36:35 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "f5d53c09-9e53-49be-8089-99f6ea739a46"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f5d53c09-9e53-49be-8089-99f6ea739a46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ff96b347-9c79-5306-af4a-d4eda060fbd6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ff96b347-9c79-5306-af4a-d4eda060fbd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "4a1f5ed1-08b1-5714-a33f-2c9d20c76574"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4a1f5ed1-08b1-5714-a33f-2c9d20c76574",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "ac15056b-6dd5-5e45-9750-30c4f629b612"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ac15056b-6dd5-5e45-9750-30c4f629b612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "77d827a2-9859-5187-a4d1-41b3bd7fee08"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "77d827a2-9859-5187-a4d1-41b3bd7fee08",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' 
}' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "c5a15764-07ab-5d02-a78e-ada5966d8ccc"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c5a15764-07ab-5d02-a78e-ada5966d8ccc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "3dbfbb0f-cb8f-5f56-92e7-4bd8274b8f3d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3dbfbb0f-cb8f-5f56-92e7-4bd8274b8f3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "f8cd6e62-fec1-5d97-84ac-c35a89a85576"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f8cd6e62-fec1-5d97-84ac-c35a89a85576",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' 
' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "e092fa87-dc0d-579b-9705-de16c24c024b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e092fa87-dc0d-579b-9705-de16c24c024b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "042bbc58-9a08-533b-87fe-e7ff4851ae92"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "042bbc58-9a08-533b-87fe-e7ff4851ae92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "44d9da4a-12e4-560e-b2bb-f3e5918cf979"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 
8192,' ' "uuid": "44d9da4a-12e4-560e-b2bb-f3e5918cf979",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "50afbc9f-48c2-55d5-91bb-df27f9ebafd6"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "50afbc9f-48c2-55d5-91bb-df27f9ebafd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' 
"unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "d36ae0e4-d523-4a48-85c8-83500b98f038",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "77ba70fd-23c6-4162-a209-ed0defe4b139",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "176297b6-dfb5-4f4c-a821-a6ff406023cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "176297b6-dfb5-4f4c-a821-a6ff406023cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": 
"system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "176297b6-dfb5-4f4c-a821-a6ff406023cc",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "32941329-ce4c-45f1-822e-db59c38d696a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a6f505be-ce00-499c-b815-12cca33eeecd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "194af16a-9e77-4a25-9489-7ce599692d07"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "194af16a-9e77-4a25-9489-7ce599692d07",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "194af16a-9e77-4a25-9489-7ce599692d07",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' 
"num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "ce2617bf-6980-4466-82c5-a312e509b721",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "ad324aad-24ed-4b65-8fff-29748833e968",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "eaedbead-5076-47d8-b393-529b940bc433"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "eaedbead-5076-47d8-b393-529b940bc433",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:25:52.033 11:36:35 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:25:52.033 11:36:35 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:25:52.033 11:36:35 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:25:52.033 11:36:35 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 113222 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@949 -- # '[' -z 113222 ']' 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@953 -- # kill -0 113222 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@954 -- # uname 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 
00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 113222 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@967 -- # echo 'killing process with pid 113222' 00:25:52.033 killing process with pid 113222 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@968 -- # kill 113222 00:25:52.033 11:36:35 blockdev_general -- common/autotest_common.sh@973 -- # wait 113222 00:25:52.601 11:36:36 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:52.601 11:36:36 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:25:52.601 11:36:36 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:52.601 11:36:36 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:52.601 11:36:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:52.601 ************************************ 00:25:52.601 START TEST bdev_hello_world 00:25:52.601 ************************************ 00:25:52.601 11:36:36 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:25:52.601 [2024-06-10 11:36:36.401690] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:52.601 [2024-06-10 11:36:36.401738] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid113555 ] 00:25:52.601 [2024-06-10 11:36:36.486216] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.860 [2024-06-10 11:36:36.567120] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:52.860 [2024-06-10 11:36:36.715458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:52.860 [2024-06-10 11:36:36.715509] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:52.860 [2024-06-10 11:36:36.715519] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:52.860 [2024-06-10 11:36:36.723461] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:52.860 [2024-06-10 11:36:36.723480] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:52.860 [2024-06-10 11:36:36.731473] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:52.860 [2024-06-10 11:36:36.731490] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:52.860 [2024-06-10 11:36:36.802256] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:52.860 [2024-06-10 11:36:36.802298] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:52.861 [2024-06-10 11:36:36.802312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e77d0 00:25:52.861 [2024-06-10 11:36:36.802321] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:52.861 [2024-06-10 11:36:36.803358] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:25:52.861 [2024-06-10 11:36:36.803383] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:25:53.120 [2024-06-10 11:36:36.942176] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:25:53.120 [2024-06-10 11:36:36.942220] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:25:53.120 [2024-06-10 11:36:36.942246] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:25:53.120 [2024-06-10 11:36:36.942281] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:25:53.120 [2024-06-10 11:36:36.942318] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:25:53.120 [2024-06-10 11:36:36.942333] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:25:53.120 [2024-06-10 11:36:36.942362] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:25:53.120 00:25:53.120 [2024-06-10 11:36:36.942380] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:25:53.379 00:25:53.379 real 0m0.899s 00:25:53.379 user 0m0.596s 00:25:53.379 sys 0m0.262s 00:25:53.379 11:36:37 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:53.379 11:36:37 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:25:53.379 ************************************ 00:25:53.379 END TEST bdev_hello_world 00:25:53.379 ************************************ 00:25:53.379 11:36:37 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:25:53.379 11:36:37 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:25:53.379 11:36:37 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:53.379 11:36:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:53.639 ************************************ 00:25:53.639 START TEST bdev_bounds 00:25:53.639 ************************************ 00:25:53.639 11:36:37 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=113625 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 113625' 00:25:53.639 Process bdevio pid: 113625 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 113625 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 113625 ']' 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:53.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:53.639 11:36:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:53.639 [2024-06-10 11:36:37.379648] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:25:53.639 [2024-06-10 11:36:37.379697] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid113625 ] 00:25:53.639 [2024-06-10 11:36:37.471087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:53.639 [2024-06-10 11:36:37.554393] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:25:53.639 [2024-06-10 11:36:37.554488] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:25:53.639 [2024-06-10 11:36:37.554490] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:53.898 [2024-06-10 11:36:37.699007] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:53.898 [2024-06-10 11:36:37.699061] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:53.898 [2024-06-10 11:36:37.699072] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:53.898 [2024-06-10 11:36:37.707019] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:53.898 [2024-06-10 11:36:37.707039] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:53.898 [2024-06-10 11:36:37.715032] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:53.898 [2024-06-10 11:36:37.715050] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:53.898 [2024-06-10 11:36:37.788408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:53.898 [2024-06-10 11:36:37.788458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.898 [2024-06-10 11:36:37.788471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf83040 00:25:53.898 [2024-06-10 
11:36:37.788480] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.898 [2024-06-10 11:36:37.789560] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.898 [2024-06-10 11:36:37.789585] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:25:54.467 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:54.467 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:25:54.467 11:36:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:54.467 I/O targets: 00:25:54.467 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:25:54.467 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:25:54.467 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:25:54.467 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:25:54.467 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:25:54.467 raid0: 131072 blocks of 512 bytes (64 MiB) 00:25:54.467 concat0: 131072 blocks of 512 bytes (64 MiB) 00:25:54.467 raid1: 65536 blocks of 512 bytes (32 MiB) 00:25:54.467 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:25:54.467 00:25:54.467 00:25:54.467 CUnit - A unit testing framework for C - Version 2.1-3 00:25:54.467 http://cunit.sourceforge.net/ 00:25:54.468 00:25:54.468 00:25:54.468 Suite: bdevio tests on: AIO0 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes 
read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: raid1 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev write 
read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: concat0 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 128k in two 
iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: raid0 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: TestPT 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes 
read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: Malloc2p7 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev 
write read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: Malloc2p6 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.468 Test: blockdev write read 8 blocks ...passed 00:25:54.468 Test: blockdev write read size > 128k ...passed 00:25:54.468 Test: blockdev write read invalid size ...passed 00:25:54.468 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.468 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.468 Test: blockdev write read max offset ...passed 00:25:54.468 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.468 Test: blockdev writev readv 8 blocks ...passed 00:25:54.468 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.468 Test: blockdev writev readv block ...passed 00:25:54.468 Test: blockdev writev readv size > 128k ...passed 00:25:54.468 Test: blockdev writev readv size > 
128k in two iovs ...passed 00:25:54.468 Test: blockdev comparev and writev ...passed 00:25:54.468 Test: blockdev nvme passthru rw ...passed 00:25:54.468 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.468 Test: blockdev nvme admin passthru ...passed 00:25:54.468 Test: blockdev copy ...passed 00:25:54.468 Suite: bdevio tests on: Malloc2p5 00:25:54.468 Test: blockdev write read block ...passed 00:25:54.468 Test: blockdev write zeroes read block ...passed 00:25:54.468 Test: blockdev write zeroes read no split ...passed 00:25:54.468 Test: blockdev write zeroes read split ...passed 00:25:54.468 Test: blockdev write zeroes read split partial ...passed 00:25:54.468 Test: blockdev reset ...passed 00:25:54.469 Test: blockdev write read 8 blocks ...passed 00:25:54.469 Test: blockdev write read size > 128k ...passed 00:25:54.469 Test: blockdev write read invalid size ...passed 00:25:54.469 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.469 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.469 Test: blockdev write read max offset ...passed 00:25:54.469 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.469 Test: blockdev writev readv 8 blocks ...passed 00:25:54.469 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.469 Test: blockdev writev readv block ...passed 00:25:54.469 Test: blockdev writev readv size > 128k ...passed 00:25:54.469 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.469 Test: blockdev comparev and writev ...passed 00:25:54.469 Test: blockdev nvme passthru rw ...passed 00:25:54.469 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.469 Test: blockdev nvme admin passthru ...passed 00:25:54.469 Test: blockdev copy ...passed 00:25:54.469 Suite: bdevio tests on: Malloc2p4 00:25:54.469 Test: blockdev write read block ...passed 00:25:54.469 Test: blockdev write zeroes read block ...passed 00:25:54.469 Test: 
blockdev write zeroes read no split ...passed 00:25:54.469 Test: blockdev write zeroes read split ...passed 00:25:54.469 Test: blockdev write zeroes read split partial ...passed 00:25:54.469 Test: blockdev reset ...passed 00:25:54.469 Test: blockdev write read 8 blocks ...passed 00:25:54.469 Test: blockdev write read size > 128k ...passed 00:25:54.469 Test: blockdev write read invalid size ...passed 00:25:54.469 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.469 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.469 Test: blockdev write read max offset ...passed 00:25:54.469 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.469 Test: blockdev writev readv 8 blocks ...passed 00:25:54.469 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.469 Test: blockdev writev readv block ...passed 00:25:54.469 Test: blockdev writev readv size > 128k ...passed 00:25:54.469 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.469 Test: blockdev comparev and writev ...passed 00:25:54.469 Test: blockdev nvme passthru rw ...passed 00:25:54.469 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.469 Test: blockdev nvme admin passthru ...passed 00:25:54.469 Test: blockdev copy ...passed 00:25:54.469 Suite: bdevio tests on: Malloc2p3 00:25:54.469 Test: blockdev write read block ...passed 00:25:54.469 Test: blockdev write zeroes read block ...passed 00:25:54.469 Test: blockdev write zeroes read no split ...passed 00:25:54.729 Test: blockdev write zeroes read split ...passed 00:25:54.729 Test: blockdev write zeroes read split partial ...passed 00:25:54.729 Test: blockdev reset ...passed 00:25:54.729 Test: blockdev write read 8 blocks ...passed 00:25:54.729 Test: blockdev write read size > 128k ...passed 00:25:54.729 Test: blockdev write read invalid size ...passed 00:25:54.729 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:25:54.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.729 Test: blockdev write read max offset ...passed 00:25:54.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.729 Test: blockdev writev readv 8 blocks ...passed 00:25:54.729 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.729 Test: blockdev writev readv block ...passed 00:25:54.729 Test: blockdev writev readv size > 128k ...passed 00:25:54.729 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.729 Test: blockdev comparev and writev ...passed 00:25:54.729 Test: blockdev nvme passthru rw ...passed 00:25:54.729 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.729 Test: blockdev nvme admin passthru ...passed 00:25:54.729 Test: blockdev copy ...passed 00:25:54.729 Suite: bdevio tests on: Malloc2p2 00:25:54.729 Test: blockdev write read block ...passed 00:25:54.729 Test: blockdev write zeroes read block ...passed 00:25:54.729 Test: blockdev write zeroes read no split ...passed 00:25:54.729 Test: blockdev write zeroes read split ...passed 00:25:54.729 Test: blockdev write zeroes read split partial ...passed 00:25:54.729 Test: blockdev reset ...passed 00:25:54.729 Test: blockdev write read 8 blocks ...passed 00:25:54.729 Test: blockdev write read size > 128k ...passed 00:25:54.729 Test: blockdev write read invalid size ...passed 00:25:54.729 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.729 Test: blockdev write read max offset ...passed 00:25:54.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.729 Test: blockdev writev readv 8 blocks ...passed 00:25:54.729 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.729 Test: blockdev writev readv block ...passed 00:25:54.729 Test: blockdev writev readv size > 128k ...passed 00:25:54.729 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:25:54.729 Test: blockdev comparev and writev ...passed 00:25:54.729 Test: blockdev nvme passthru rw ...passed 00:25:54.729 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.729 Test: blockdev nvme admin passthru ...passed 00:25:54.729 Test: blockdev copy ...passed 00:25:54.729 Suite: bdevio tests on: Malloc2p1 00:25:54.729 Test: blockdev write read block ...passed 00:25:54.729 Test: blockdev write zeroes read block ...passed 00:25:54.729 Test: blockdev write zeroes read no split ...passed 00:25:54.729 Test: blockdev write zeroes read split ...passed 00:25:54.729 Test: blockdev write zeroes read split partial ...passed 00:25:54.729 Test: blockdev reset ...passed 00:25:54.729 Test: blockdev write read 8 blocks ...passed 00:25:54.729 Test: blockdev write read size > 128k ...passed 00:25:54.729 Test: blockdev write read invalid size ...passed 00:25:54.729 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.729 Test: blockdev write read max offset ...passed 00:25:54.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.729 Test: blockdev writev readv 8 blocks ...passed 00:25:54.729 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.729 Test: blockdev writev readv block ...passed 00:25:54.729 Test: blockdev writev readv size > 128k ...passed 00:25:54.729 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.729 Test: blockdev comparev and writev ...passed 00:25:54.729 Test: blockdev nvme passthru rw ...passed 00:25:54.729 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.729 Test: blockdev nvme admin passthru ...passed 00:25:54.729 Test: blockdev copy ...passed 00:25:54.729 Suite: bdevio tests on: Malloc2p0 00:25:54.729 Test: blockdev write read block ...passed 00:25:54.729 Test: blockdev write zeroes read block 
...passed 00:25:54.729 Test: blockdev write zeroes read no split ...passed 00:25:54.729 Test: blockdev write zeroes read split ...passed 00:25:54.729 Test: blockdev write zeroes read split partial ...passed 00:25:54.729 Test: blockdev reset ...passed 00:25:54.729 Test: blockdev write read 8 blocks ...passed 00:25:54.729 Test: blockdev write read size > 128k ...passed 00:25:54.729 Test: blockdev write read invalid size ...passed 00:25:54.729 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.729 Test: blockdev write read max offset ...passed 00:25:54.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.729 Test: blockdev writev readv 8 blocks ...passed 00:25:54.729 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.729 Test: blockdev writev readv block ...passed 00:25:54.729 Test: blockdev writev readv size > 128k ...passed 00:25:54.729 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.729 Test: blockdev comparev and writev ...passed 00:25:54.729 Test: blockdev nvme passthru rw ...passed 00:25:54.729 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.729 Test: blockdev nvme admin passthru ...passed 00:25:54.729 Test: blockdev copy ...passed 00:25:54.729 Suite: bdevio tests on: Malloc1p1 00:25:54.729 Test: blockdev write read block ...passed 00:25:54.729 Test: blockdev write zeroes read block ...passed 00:25:54.729 Test: blockdev write zeroes read no split ...passed 00:25:54.729 Test: blockdev write zeroes read split ...passed 00:25:54.729 Test: blockdev write zeroes read split partial ...passed 00:25:54.729 Test: blockdev reset ...passed 00:25:54.729 Test: blockdev write read 8 blocks ...passed 00:25:54.729 Test: blockdev write read size > 128k ...passed 00:25:54.729 Test: blockdev write read invalid size ...passed 00:25:54.729 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:25:54.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.729 Test: blockdev write read max offset ...passed 00:25:54.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.729 Test: blockdev writev readv 8 blocks ...passed 00:25:54.729 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.729 Test: blockdev writev readv block ...passed 00:25:54.729 Test: blockdev writev readv size > 128k ...passed 00:25:54.729 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.729 Test: blockdev comparev and writev ...passed 00:25:54.729 Test: blockdev nvme passthru rw ...passed 00:25:54.729 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.729 Test: blockdev nvme admin passthru ...passed 00:25:54.729 Test: blockdev copy ...passed 00:25:54.729 Suite: bdevio tests on: Malloc1p0 00:25:54.729 Test: blockdev write read block ...passed 00:25:54.729 Test: blockdev write zeroes read block ...passed 00:25:54.729 Test: blockdev write zeroes read no split ...passed 00:25:54.729 Test: blockdev write zeroes read split ...passed 00:25:54.729 Test: blockdev write zeroes read split partial ...passed 00:25:54.729 Test: blockdev reset ...passed 00:25:54.729 Test: blockdev write read 8 blocks ...passed 00:25:54.729 Test: blockdev write read size > 128k ...passed 00:25:54.729 Test: blockdev write read invalid size ...passed 00:25:54.729 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.729 Test: blockdev write read max offset ...passed 00:25:54.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.729 Test: blockdev writev readv 8 blocks ...passed 00:25:54.729 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.729 Test: blockdev writev readv block ...passed 00:25:54.729 Test: blockdev writev readv size > 128k ...passed 
00:25:54.729 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.729 Test: blockdev comparev and writev ...passed 00:25:54.729 Test: blockdev nvme passthru rw ...passed 00:25:54.729 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.729 Test: blockdev nvme admin passthru ...passed 00:25:54.729 Test: blockdev copy ...passed 00:25:54.729 Suite: bdevio tests on: Malloc0 00:25:54.730 Test: blockdev write read block ...passed 00:25:54.730 Test: blockdev write zeroes read block ...passed 00:25:54.730 Test: blockdev write zeroes read no split ...passed 00:25:54.730 Test: blockdev write zeroes read split ...passed 00:25:54.730 Test: blockdev write zeroes read split partial ...passed 00:25:54.730 Test: blockdev reset ...passed 00:25:54.730 Test: blockdev write read 8 blocks ...passed 00:25:54.730 Test: blockdev write read size > 128k ...passed 00:25:54.730 Test: blockdev write read invalid size ...passed 00:25:54.730 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:54.730 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:54.730 Test: blockdev write read max offset ...passed 00:25:54.730 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:54.730 Test: blockdev writev readv 8 blocks ...passed 00:25:54.730 Test: blockdev writev readv 30 x 1block ...passed 00:25:54.730 Test: blockdev writev readv block ...passed 00:25:54.730 Test: blockdev writev readv size > 128k ...passed 00:25:54.730 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:54.730 Test: blockdev comparev and writev ...passed 00:25:54.730 Test: blockdev nvme passthru rw ...passed 00:25:54.730 Test: blockdev nvme passthru vendor specific ...passed 00:25:54.730 Test: blockdev nvme admin passthru ...passed 00:25:54.730 Test: blockdev copy ...passed 00:25:54.730 00:25:54.730 Run Summary: Type Total Ran Passed Failed Inactive 00:25:54.730 suites 16 16 n/a 0 0 00:25:54.730 tests 368 368 368 
0 0 00:25:54.730 asserts 2224 2224 2224 0 n/a 00:25:54.730 00:25:54.730 Elapsed time = 0.473 seconds 00:25:54.730 0 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 113625 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 113625 ']' 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 113625 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 113625 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 113625' 00:25:54.730 killing process with pid 113625 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # kill 113625 00:25:54.730 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@973 -- # wait 113625 00:25:54.989 11:36:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:25:54.989 00:25:54.989 real 0m1.505s 00:25:54.989 user 0m3.695s 00:25:54.989 sys 0m0.435s 00:25:54.989 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:54.989 11:36:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:54.989 ************************************ 00:25:54.989 END TEST bdev_bounds 00:25:54.989 ************************************ 00:25:54.989 11:36:38 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:25:54.989 11:36:38 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:25:54.989 11:36:38 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:54.989 11:36:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:25:54.989 ************************************ 00:25:54.989 START TEST bdev_nbd 00:25:54.989 ************************************ 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:25:54.989 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=113903 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 113903 /var/tmp/spdk-nbd.sock 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 113903 ']' 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:25:54.990 
11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:25:54.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:54.990 11:36:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:55.248 [2024-06-10 11:36:38.969008] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:25:55.248 [2024-06-10 11:36:38.969060] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:55.248 [2024-06-10 11:36:39.059501] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:55.248 [2024-06-10 11:36:39.140517] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.507 [2024-06-10 11:36:39.280394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:55.507 [2024-06-10 11:36:39.280440] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:55.507 [2024-06-10 11:36:39.280449] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:55.507 [2024-06-10 11:36:39.288401] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:55.507 [2024-06-10 11:36:39.288419] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:55.507 [2024-06-10 11:36:39.296414] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:55.507 [2024-06-10 11:36:39.296430] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:55.507 [2024-06-10 11:36:39.364143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:25:55.507 [2024-06-10 11:36:39.364189] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:55.507 [2024-06-10 11:36:39.364204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22aa620 00:25:55.507 [2024-06-10 11:36:39.364213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:55.507 [2024-06-10 11:36:39.365169] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:55.507 [2024-06-10 11:36:39.365192] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT 
raid0 concat0 raid1 AIO0' 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:56.073 11:36:39 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:56.073 1+0 records in 00:25:56.073 1+0 records out 00:25:56.073 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231475 s, 17.7 MB/s 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:56.073 11:36:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:56.332 11:36:40 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:56.332 1+0 records in 00:25:56.332 1+0 records out 00:25:56.332 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237637 s, 17.2 MB/s 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:56.332 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd2 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:56.591 1+0 records in 00:25:56.591 1+0 records out 00:25:56.591 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026128 s, 15.7 MB/s 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:56.591 11:36:40 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:56.591 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:56.849 1+0 records in 00:25:56.849 1+0 records out 00:25:56.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263017 s, 15.6 MB/s 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # size=4096 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:56.849 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:56.850 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:56.850 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:56.850 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:57.108 1+0 records in 00:25:57.108 1+0 records out 00:25:57.108 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339855 s, 12.1 MB/s 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:57.108 11:36:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:25:57.108 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep 
-q -w nbd5 /proc/partitions 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:57.109 1+0 records in 00:25:57.109 1+0 records out 00:25:57.109 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313202 s, 13.1 MB/s 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:57.109 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:57.368 1+0 records in 00:25:57.368 1+0 records out 00:25:57.368 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298227 s, 13.7 MB/s 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:57.368 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:57.627 1+0 records in 00:25:57.627 1+0 records out 00:25:57.627 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031155 s, 13.1 MB/s 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.627 11:36:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:57.627 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:57.886 1+0 records in 00:25:57.886 1+0 records out 00:25:57.886 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000414208 s, 9.9 MB/s 00:25:57.886 11:36:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:57.886 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:58.145 11:36:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.145 1+0 records in 00:25:58.145 1+0 records out 00:25:58.145 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0004028 s, 10.2 MB/s 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:58.145 11:36:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 
00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.403 1+0 records in 00:25:58.403 1+0 records out 00:25:58.403 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000426446 s, 9.6 MB/s 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:25:58.403 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.661 1+0 records in 00:25:58.661 1+0 records out 00:25:58.661 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000429662 s, 9.5 MB/s 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 
00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.661 1+0 records in 00:25:58.661 1+0 records out 00:25:58.661 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000399557 s, 10.3 MB/s 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:58.661 11:36:42 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:58.661 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:25:58.928 1+0 records in 00:25:58.928 1+0 records out 00:25:58.928 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000537532 s, 7.6 MB/s 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:58.928 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:25:59.188 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:25:59.188 11:36:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@872 -- # break 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.188 1+0 records in 00:25:59.188 1+0 records out 00:25:59.188 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000650193 s, 6.3 MB/s 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:59.188 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:25:59.447 11:36:43 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.447 1+0 records in 00:25:59.447 1+0 records out 00:25:59.447 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000575072 s, 7.1 MB/s 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:25:59.447 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 
00:25:59.705 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:25:59.705 { 00:25:59.705 "nbd_device": "/dev/nbd0", 00:25:59.705 "bdev_name": "Malloc0" 00:25:59.705 }, 00:25:59.705 { 00:25:59.705 "nbd_device": "/dev/nbd1", 00:25:59.705 "bdev_name": "Malloc1p0" 00:25:59.705 }, 00:25:59.705 { 00:25:59.705 "nbd_device": "/dev/nbd2", 00:25:59.705 "bdev_name": "Malloc1p1" 00:25:59.705 }, 00:25:59.705 { 00:25:59.705 "nbd_device": "/dev/nbd3", 00:25:59.705 "bdev_name": "Malloc2p0" 00:25:59.705 }, 00:25:59.705 { 00:25:59.705 "nbd_device": "/dev/nbd4", 00:25:59.705 "bdev_name": "Malloc2p1" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd5", 00:25:59.706 "bdev_name": "Malloc2p2" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd6", 00:25:59.706 "bdev_name": "Malloc2p3" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd7", 00:25:59.706 "bdev_name": "Malloc2p4" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd8", 00:25:59.706 "bdev_name": "Malloc2p5" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd9", 00:25:59.706 "bdev_name": "Malloc2p6" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd10", 00:25:59.706 "bdev_name": "Malloc2p7" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd11", 00:25:59.706 "bdev_name": "TestPT" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd12", 00:25:59.706 "bdev_name": "raid0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd13", 00:25:59.706 "bdev_name": "concat0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd14", 00:25:59.706 "bdev_name": "raid1" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd15", 00:25:59.706 "bdev_name": "AIO0" 00:25:59.706 } 00:25:59.706 ]' 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 
00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd0", 00:25:59.706 "bdev_name": "Malloc0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd1", 00:25:59.706 "bdev_name": "Malloc1p0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd2", 00:25:59.706 "bdev_name": "Malloc1p1" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd3", 00:25:59.706 "bdev_name": "Malloc2p0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd4", 00:25:59.706 "bdev_name": "Malloc2p1" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd5", 00:25:59.706 "bdev_name": "Malloc2p2" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd6", 00:25:59.706 "bdev_name": "Malloc2p3" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd7", 00:25:59.706 "bdev_name": "Malloc2p4" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd8", 00:25:59.706 "bdev_name": "Malloc2p5" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd9", 00:25:59.706 "bdev_name": "Malloc2p6" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd10", 00:25:59.706 "bdev_name": "Malloc2p7" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd11", 00:25:59.706 "bdev_name": "TestPT" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd12", 00:25:59.706 "bdev_name": "raid0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd13", 00:25:59.706 "bdev_name": "concat0" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd14", 00:25:59.706 "bdev_name": "raid1" 00:25:59.706 }, 00:25:59.706 { 00:25:59.706 "nbd_device": "/dev/nbd15", 00:25:59.706 "bdev_name": "AIO0" 00:25:59.706 } 00:25:59.706 ]' 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.706 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.706 11:36:43 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.965 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:26:00.224 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:26:00.224 11:36:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.224 
11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.224 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.482 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.740 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd6 /proc/partitions 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:00.999 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.260 11:36:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.260 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.562 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:26:01.821 11:36:45 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.821 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 
00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.080 11:36:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:26:02.339 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd14 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:02.599 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:02.599 11:36:46 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 
'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:02.858 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:26:03.118 /dev/nbd0 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:03.118 1+0 records in 00:26:03.118 1+0 records out 00:26:03.118 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261165 s, 15.7 MB/s 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:03.118 11:36:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:26:03.379 /dev/nbd1 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:03.379 1+0 records in 00:26:03.379 1+0 records out 00:26:03.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275015 s, 14.9 
MB/s 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:03.379 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:26:03.379 /dev/nbd10 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:03.639 1+0 records in 00:26:03.639 1+0 records out 00:26:03.639 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285195 s, 14.4 MB/s 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:26:03.639 /dev/nbd11 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 
00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:03.639 1+0 records in 00:26:03.639 1+0 records out 00:26:03.639 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265218 s, 15.4 MB/s 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:03.639 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:26:03.898 /dev/nbd12 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:26:03.898 11:36:47 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:03.898 1+0 records in 00:26:03.898 1+0 records out 00:26:03.898 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308098 s, 13.3 MB/s 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:03.898 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:26:04.157 /dev/nbd13 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:04.157 1+0 records in 00:26:04.157 1+0 records out 00:26:04.157 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000346688 s, 11.8 MB/s 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 
'!=' 0 ']' 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:04.157 11:36:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:26:04.416 /dev/nbd14 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:04.416 1+0 records in 00:26:04.416 1+0 records out 00:26:04.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359667 s, 11.4 MB/s 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:04.416 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:26:04.675 /dev/nbd15 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:04.675 1+0 records in 00:26:04.675 1+0 records out 00:26:04.675 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000500659 s, 8.2 MB/s 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:04.675 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:26:04.675 /dev/nbd2 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 
/proc/partitions 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:04.934 1+0 records in 00:26:04.934 1+0 records out 00:26:04.934 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000448208 s, 9.1 MB/s 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:04.934 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:26:04.935 /dev/nbd3 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:26:04.935 11:36:48 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:04.935 1+0 records in 00:26:04.935 1+0 records out 00:26:04.935 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000426034 s, 9.6 MB/s 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:04.935 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.194 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:05.194 11:36:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:05.194 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.194 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:05.194 11:36:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 
/dev/nbd4 00:26:05.194 /dev/nbd4 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.194 1+0 records in 00:26:05.194 1+0 records out 00:26:05.194 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00046126 s, 8.9 MB/s 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:05.194 
11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:05.194 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:26:05.453 /dev/nbd5 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.453 1+0 records in 00:26:05.453 1+0 records out 00:26:05.453 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000527145 s, 7.8 MB/s 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:05.453 11:36:49 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:05.453 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:26:05.712 /dev/nbd6 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.712 1+0 records in 00:26:05.712 1+0 records out 
00:26:05.712 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000488964 s, 8.4 MB/s 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:05.712 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:26:05.971 /dev/nbd7 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 
00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.971 1+0 records in 00:26:05.971 1+0 records out 00:26:05.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000667025 s, 6.1 MB/s 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.971 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:05.972 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:26:05.972 /dev/nbd8 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:06.230 1+0 records in 00:26:06.230 1+0 records out 00:26:06.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000447604 s, 9.2 MB/s 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:06.230 11:36:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:26:06.230 /dev/nbd9 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 
00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:06.230 1+0 records in 00:26:06.230 1+0 records out 00:26:06.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000522325 s, 7.8 MB/s 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:06.230 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:26:06.488 11:36:50 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:06.488 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:06.488 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:06.488 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd0", 00:26:06.488 "bdev_name": "Malloc0" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd1", 00:26:06.488 "bdev_name": "Malloc1p0" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd10", 00:26:06.488 "bdev_name": "Malloc1p1" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd11", 00:26:06.488 "bdev_name": "Malloc2p0" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd12", 00:26:06.488 "bdev_name": "Malloc2p1" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd13", 00:26:06.488 "bdev_name": "Malloc2p2" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd14", 00:26:06.488 "bdev_name": "Malloc2p3" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd15", 00:26:06.488 "bdev_name": "Malloc2p4" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd2", 00:26:06.488 "bdev_name": "Malloc2p5" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd3", 00:26:06.488 "bdev_name": "Malloc2p6" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd4", 00:26:06.488 "bdev_name": "Malloc2p7" 00:26:06.488 }, 00:26:06.488 { 00:26:06.488 "nbd_device": "/dev/nbd5", 00:26:06.488 "bdev_name": "TestPT" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd6", 00:26:06.489 "bdev_name": "raid0" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd7", 00:26:06.489 "bdev_name": "concat0" 00:26:06.489 }, 00:26:06.489 { 
00:26:06.489 "nbd_device": "/dev/nbd8", 00:26:06.489 "bdev_name": "raid1" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd9", 00:26:06.489 "bdev_name": "AIO0" 00:26:06.489 } 00:26:06.489 ]' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd0", 00:26:06.489 "bdev_name": "Malloc0" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd1", 00:26:06.489 "bdev_name": "Malloc1p0" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd10", 00:26:06.489 "bdev_name": "Malloc1p1" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd11", 00:26:06.489 "bdev_name": "Malloc2p0" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd12", 00:26:06.489 "bdev_name": "Malloc2p1" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd13", 00:26:06.489 "bdev_name": "Malloc2p2" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd14", 00:26:06.489 "bdev_name": "Malloc2p3" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd15", 00:26:06.489 "bdev_name": "Malloc2p4" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd2", 00:26:06.489 "bdev_name": "Malloc2p5" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd3", 00:26:06.489 "bdev_name": "Malloc2p6" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd4", 00:26:06.489 "bdev_name": "Malloc2p7" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd5", 00:26:06.489 "bdev_name": "TestPT" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd6", 00:26:06.489 "bdev_name": "raid0" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd7", 00:26:06.489 "bdev_name": "concat0" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd8", 00:26:06.489 "bdev_name": "raid1" 00:26:06.489 }, 00:26:06.489 { 00:26:06.489 "nbd_device": "/dev/nbd9", 00:26:06.489 
"bdev_name": "AIO0" 00:26:06.489 } 00:26:06.489 ]' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:26:06.489 /dev/nbd1 00:26:06.489 /dev/nbd10 00:26:06.489 /dev/nbd11 00:26:06.489 /dev/nbd12 00:26:06.489 /dev/nbd13 00:26:06.489 /dev/nbd14 00:26:06.489 /dev/nbd15 00:26:06.489 /dev/nbd2 00:26:06.489 /dev/nbd3 00:26:06.489 /dev/nbd4 00:26:06.489 /dev/nbd5 00:26:06.489 /dev/nbd6 00:26:06.489 /dev/nbd7 00:26:06.489 /dev/nbd8 00:26:06.489 /dev/nbd9' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:26:06.489 /dev/nbd1 00:26:06.489 /dev/nbd10 00:26:06.489 /dev/nbd11 00:26:06.489 /dev/nbd12 00:26:06.489 /dev/nbd13 00:26:06.489 /dev/nbd14 00:26:06.489 /dev/nbd15 00:26:06.489 /dev/nbd2 00:26:06.489 /dev/nbd3 00:26:06.489 /dev/nbd4 00:26:06.489 /dev/nbd5 00:26:06.489 /dev/nbd6 00:26:06.489 /dev/nbd7 00:26:06.489 /dev/nbd8 00:26:06.489 /dev/nbd9' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:26:06.489 256+0 records in 00:26:06.489 256+0 records out 00:26:06.489 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112679 s, 93.1 MB/s 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:06.489 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:26:06.747 256+0 records in 00:26:06.747 256+0 records out 00:26:06.747 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.117419 s, 8.9 MB/s 00:26:06.747 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:06.747 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:26:06.747 256+0 records in 00:26:06.747 256+0 records out 00:26:06.747 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121175 s, 8.7 MB/s 00:26:06.747 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:06.747 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 
bs=4096 count=256 oflag=direct 00:26:07.005 256+0 records in 00:26:07.005 256+0 records out 00:26:07.005 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122029 s, 8.6 MB/s 00:26:07.005 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.005 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:26:07.005 256+0 records in 00:26:07.005 256+0 records out 00:26:07.005 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122957 s, 8.5 MB/s 00:26:07.005 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.005 11:36:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:26:07.263 256+0 records in 00:26:07.263 256+0 records out 00:26:07.263 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121017 s, 8.7 MB/s 00:26:07.263 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.263 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:26:07.263 256+0 records in 00:26:07.263 256+0 records out 00:26:07.263 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121423 s, 8.6 MB/s 00:26:07.263 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.263 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:26:07.521 256+0 records in 00:26:07.521 256+0 records out 00:26:07.521 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120272 s, 8.7 MB/s 00:26:07.521 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for 
i in "${nbd_list[@]}" 00:26:07.521 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:26:07.521 256+0 records in 00:26:07.521 256+0 records out 00:26:07.521 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121842 s, 8.6 MB/s 00:26:07.521 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.521 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:26:07.779 256+0 records in 00:26:07.779 256+0 records out 00:26:07.779 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121311 s, 8.6 MB/s 00:26:07.779 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.779 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:26:07.779 256+0 records in 00:26:07.779 256+0 records out 00:26:07.779 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122008 s, 8.6 MB/s 00:26:07.779 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:07.779 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:26:08.037 256+0 records in 00:26:08.037 256+0 records out 00:26:08.037 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120989 s, 8.7 MB/s 00:26:08.037 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:08.037 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:26:08.037 256+0 records in 
00:26:08.037 256+0 records out 00:26:08.037 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.122343 s, 8.6 MB/s 00:26:08.037 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:08.037 11:36:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:26:08.300 256+0 records in 00:26:08.301 256+0 records out 00:26:08.301 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120344 s, 8.7 MB/s 00:26:08.301 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:08.301 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:26:08.301 256+0 records in 00:26:08.301 256+0 records out 00:26:08.301 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.123501 s, 8.5 MB/s 00:26:08.301 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:08.301 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:26:08.566 256+0 records in 00:26:08.566 256+0 records out 00:26:08.566 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.123712 s, 8.5 MB/s 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:26:08.566 256+0 records in 00:26:08.566 256+0 records out 00:26:08.566 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121303 s, 8.6 MB/s 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 
/dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:26:08.566 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.567 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
/dev/nbd4 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:08.826 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:09.085 11:36:52 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:09.085 11:36:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:09.344 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 
00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:26:09.603 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd13 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:09.862 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.121 11:36:53 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.380 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.381 11:36:54 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.381 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.639 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.897 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:11.155 11:36:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:26:11.155 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:26:11.155 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:26:11.155 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:26:11.155 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.155 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:11.155 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd6 /proc/partitions 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:11.413 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:11.671 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:11.928 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:11.928 11:36:55 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:26:12.187 11:36:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:26:12.187 
malloc_lvol_verify 00:26:12.187 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:26:12.445 56c21332-dbd2-413a-971a-df3881238c2c 00:26:12.445 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:26:12.703 76669854-0ec2-4553-b88a-7f0e26514df6 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:26:12.703 /dev/nbd0 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:26:12.703 mke2fs 1.46.5 (30-Dec-2021) 00:26:12.703 Discarding device blocks: 0/4096 done 00:26:12.703 Creating filesystem with 4096 1k blocks and 1024 inodes 00:26:12.703 00:26:12.703 Allocating group tables: 0/1 done 00:26:12.703 Writing inode tables: 0/1 done 00:26:12.703 Creating journal (1024 blocks): done 00:26:12.703 Writing superblocks and filesystem accounting information: 0/1 done 00:26:12.703 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:12.703 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:12.703 
11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 113903 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 113903 ']' 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 113903 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 113903 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:12.961 11:36:56 
blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 113903' 00:26:12.961 killing process with pid 113903 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # kill 113903 00:26:12.961 11:36:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@973 -- # wait 113903 00:26:13.220 11:36:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:26:13.220 00:26:13.220 real 0m18.237s 00:26:13.220 user 0m21.605s 00:26:13.220 sys 0m10.997s 00:26:13.220 11:36:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:13.220 11:36:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:13.220 ************************************ 00:26:13.220 END TEST bdev_nbd 00:26:13.220 ************************************ 00:26:13.478 11:36:57 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:26:13.478 11:36:57 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:26:13.478 11:36:57 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:26:13.478 11:36:57 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:26:13.478 11:36:57 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:13.478 11:36:57 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:13.478 11:36:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:13.478 ************************************ 00:26:13.478 START TEST bdev_fio 00:26:13.478 ************************************ 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:13.478 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:26:13.478 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 
-- # '[' verify == verify ']' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 
00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:13.479 11:36:57 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:13.479 ************************************ 00:26:13.479 START TEST bdev_fio_rw_verify 00:26:13.479 ************************************ 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 
-- # local sanitizers 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:13.479 11:36:57 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:13.738 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:26:13.738 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:13.738 fio-3.35 00:26:13.738 Starting 16 threads 00:26:25.942 00:26:25.942 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=117140: Mon Jun 10 11:37:08 2024 00:26:25.942 read: IOPS=105k, BW=409MiB/s (429MB/s)(4095MiB/10001msec) 00:26:25.942 slat (nsec): min=1909, max=3448.8k, avg=30648.85, stdev=13861.93 00:26:25.942 clat (usec): min=8, max=3568, avg=258.82, stdev=125.91 00:26:25.942 lat (usec): min=13, max=3575, avg=289.47, stdev=133.12 00:26:25.942 clat percentiles (usec): 00:26:25.942 | 50.000th=[ 251], 99.000th=[ 545], 99.900th=[ 734], 99.990th=[ 938], 00:26:25.942 | 99.999th=[ 1074] 00:26:25.942 write: IOPS=163k, BW=635MiB/s (666MB/s)(6271MiB/9874msec); 0 zone resets 00:26:25.942 slat (usec): min=4, max=280, avg=41.46, stdev=13.35 00:26:25.942 clat (usec): min=11, max=1525, avg=303.87, stdev=139.79 00:26:25.942 lat (usec): min=26, max=1638, avg=345.33, stdev=146.64 00:26:25.942 clat percentiles (usec): 00:26:25.942 | 50.000th=[ 293], 99.000th=[ 627], 99.900th=[ 775], 99.990th=[ 971], 00:26:25.942 | 99.999th=[ 1254] 00:26:25.942 bw ( KiB/s): min=528028, max=872230, per=99.03%, avg=644027.74, stdev=5671.39, samples=304 00:26:25.942 iops : min=132006, max=218055, avg=161006.79, stdev=1417.83, 
samples=304 00:26:25.942 lat (usec) : 10=0.01%, 20=0.03%, 50=1.12%, 100=6.72%, 250=35.72% 00:26:25.942 lat (usec) : 500=49.77%, 750=6.52%, 1000=0.12% 00:26:25.942 lat (msec) : 2=0.01%, 4=0.01% 00:26:25.942 cpu : usr=99.28%, sys=0.35%, ctx=650, majf=0, minf=2939 00:26:25.942 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:25.942 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:25.942 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:25.942 issued rwts: total=1048237,1605331,0,0 short=0,0,0,0 dropped=0,0,0,0 00:26:25.942 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:25.942 00:26:25.942 Run status group 0 (all jobs): 00:26:25.942 READ: bw=409MiB/s (429MB/s), 409MiB/s-409MiB/s (429MB/s-429MB/s), io=4095MiB (4294MB), run=10001-10001msec 00:26:25.942 WRITE: bw=635MiB/s (666MB/s), 635MiB/s-635MiB/s (666MB/s-666MB/s), io=6271MiB (6575MB), run=9874-9874msec 00:26:25.942 00:26:25.942 real 0m11.391s 00:26:25.942 user 2m45.107s 00:26:25.942 sys 0m1.220s 00:26:25.942 11:37:08 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:25.942 11:37:08 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:26:25.942 ************************************ 00:26:25.942 END TEST bdev_fio_rw_verify 00:26:25.942 ************************************ 00:26:25.942 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:26:25.942 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:25.942 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:26:25.943 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:25.944 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "f5d53c09-9e53-49be-8089-99f6ea739a46"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f5d53c09-9e53-49be-8089-99f6ea739a46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ff96b347-9c79-5306-af4a-d4eda060fbd6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ff96b347-9c79-5306-af4a-d4eda060fbd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "4a1f5ed1-08b1-5714-a33f-2c9d20c76574"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4a1f5ed1-08b1-5714-a33f-2c9d20c76574",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": 
{' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "ac15056b-6dd5-5e45-9750-30c4f629b612"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ac15056b-6dd5-5e45-9750-30c4f629b612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "77d827a2-9859-5187-a4d1-41b3bd7fee08"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "77d827a2-9859-5187-a4d1-41b3bd7fee08",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "c5a15764-07ab-5d02-a78e-ada5966d8ccc"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c5a15764-07ab-5d02-a78e-ada5966d8ccc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "3dbfbb0f-cb8f-5f56-92e7-4bd8274b8f3d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3dbfbb0f-cb8f-5f56-92e7-4bd8274b8f3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "f8cd6e62-fec1-5d97-84ac-c35a89a85576"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f8cd6e62-fec1-5d97-84ac-c35a89a85576",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "e092fa87-dc0d-579b-9705-de16c24c024b"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e092fa87-dc0d-579b-9705-de16c24c024b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "042bbc58-9a08-533b-87fe-e7ff4851ae92"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "042bbc58-9a08-533b-87fe-e7ff4851ae92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "44d9da4a-12e4-560e-b2bb-f3e5918cf979"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "44d9da4a-12e4-560e-b2bb-f3e5918cf979",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "50afbc9f-48c2-55d5-91bb-df27f9ebafd6"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "50afbc9f-48c2-55d5-91bb-df27f9ebafd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' 
' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "d36ae0e4-d523-4a48-85c8-83500b98f038",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "77ba70fd-23c6-4162-a209-ed0defe4b139",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "176297b6-dfb5-4f4c-a821-a6ff406023cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "176297b6-dfb5-4f4c-a821-a6ff406023cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "176297b6-dfb5-4f4c-a821-a6ff406023cc",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 
2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "32941329-ce4c-45f1-822e-db59c38d696a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "a6f505be-ce00-499c-b815-12cca33eeecd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "194af16a-9e77-4a25-9489-7ce599692d07"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "194af16a-9e77-4a25-9489-7ce599692d07",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "194af16a-9e77-4a25-9489-7ce599692d07",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "ce2617bf-6980-4466-82c5-a312e509b721",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "ad324aad-24ed-4b65-8fff-29748833e968",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' 
"eaedbead-5076-47d8-b393-529b940bc433"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "eaedbead-5076-47d8-b393-529b940bc433",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:26:25.944 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:26:25.944 Malloc1p0 00:26:25.944 Malloc1p1 00:26:25.944 Malloc2p0 00:26:25.944 Malloc2p1 00:26:25.944 Malloc2p2 00:26:25.944 Malloc2p3 00:26:25.944 Malloc2p4 00:26:25.944 Malloc2p5 00:26:25.944 Malloc2p6 00:26:25.944 Malloc2p7 00:26:25.944 TestPT 00:26:25.944 raid0 00:26:25.944 concat0 ]] 00:26:25.944 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "f5d53c09-9e53-49be-8089-99f6ea739a46"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f5d53c09-9e53-49be-8089-99f6ea739a46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ff96b347-9c79-5306-af4a-d4eda060fbd6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ff96b347-9c79-5306-af4a-d4eda060fbd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "4a1f5ed1-08b1-5714-a33f-2c9d20c76574"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4a1f5ed1-08b1-5714-a33f-2c9d20c76574",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "ac15056b-6dd5-5e45-9750-30c4f629b612"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' 
"uuid": "ac15056b-6dd5-5e45-9750-30c4f629b612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "77d827a2-9859-5187-a4d1-41b3bd7fee08"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "77d827a2-9859-5187-a4d1-41b3bd7fee08",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "c5a15764-07ab-5d02-a78e-ada5966d8ccc"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c5a15764-07ab-5d02-a78e-ada5966d8ccc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": 
false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "3dbfbb0f-cb8f-5f56-92e7-4bd8274b8f3d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3dbfbb0f-cb8f-5f56-92e7-4bd8274b8f3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "f8cd6e62-fec1-5d97-84ac-c35a89a85576"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f8cd6e62-fec1-5d97-84ac-c35a89a85576",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "e092fa87-dc0d-579b-9705-de16c24c024b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e092fa87-dc0d-579b-9705-de16c24c024b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' 
},' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "042bbc58-9a08-533b-87fe-e7ff4851ae92"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "042bbc58-9a08-533b-87fe-e7ff4851ae92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "44d9da4a-12e4-560e-b2bb-f3e5918cf979"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "44d9da4a-12e4-560e-b2bb-f3e5918cf979",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' 
"50afbc9f-48c2-55d5-91bb-df27f9ebafd6"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "50afbc9f-48c2-55d5-91bb-df27f9ebafd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a9d8ad57-cc67-41e0-85f6-d5b12a736b6b",' ' "strip_size_kb": 64,' ' "state": 
"online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "d36ae0e4-d523-4a48-85c8-83500b98f038",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "77ba70fd-23c6-4162-a209-ed0defe4b139",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "176297b6-dfb5-4f4c-a821-a6ff406023cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "176297b6-dfb5-4f4c-a821-a6ff406023cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "176297b6-dfb5-4f4c-a821-a6ff406023cc",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "32941329-ce4c-45f1-822e-db59c38d696a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": 
"a6f505be-ce00-499c-b815-12cca33eeecd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "194af16a-9e77-4a25-9489-7ce599692d07"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "194af16a-9e77-4a25-9489-7ce599692d07",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "194af16a-9e77-4a25-9489-7ce599692d07",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "ce2617bf-6980-4466-82c5-a312e509b721",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "ad324aad-24ed-4b65-8fff-29748833e968",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "eaedbead-5076-47d8-b393-529b940bc433"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "eaedbead-5076-47d8-b393-529b940bc433",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 
-- # echo '[job_Malloc2p0]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo 
filename=Malloc2p5 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:26:25.945 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:25.946 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:26:25.946 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:26:25.946 11:37:08 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:25.946 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:26:25.946 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:25.946 11:37:08 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:25.946 ************************************ 00:26:25.946 START TEST bdev_fio_trim 00:26:25.946 ************************************ 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:25.946 11:37:08 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:26:25.946 11:37:08 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:26:25.946 11:37:09 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:26:25.946 11:37:09 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:26:25.946 11:37:09 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:25.946 11:37:09 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:25.946 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:26:25.946 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:25.946 fio-3.35 00:26:25.946 Starting 14 threads 00:26:38.141 00:26:38.141 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=118816: Mon Jun 10 11:37:20 2024 00:26:38.141 write: IOPS=144k, BW=563MiB/s (590MB/s)(5626MiB/10001msec); 0 zone resets 00:26:38.141 slat (nsec): min=1900, max=874827, avg=33920.99, stdev=10013.20 00:26:38.141 clat (usec): min=21, max=1541, avg=245.08, stdev=84.15 00:26:38.141 lat (usec): min=35, max=1554, avg=279.01, stdev=87.73 00:26:38.141 clat percentiles (usec): 00:26:38.141 | 50.000th=[ 239], 99.000th=[ 424], 99.900th=[ 465], 99.990th=[ 545], 00:26:38.141 | 99.999th=[ 840] 00:26:38.141 bw ( KiB/s): min=524320, max=779000, per=100.00%, avg=577932.63, stdev=4054.96, samples=266 00:26:38.141 iops : min=131080, max=194747, avg=144483.00, stdev=1013.71, samples=266 00:26:38.141 trim: IOPS=144k, BW=563MiB/s (590MB/s)(5626MiB/10001msec); 0 zone resets 00:26:38.141 slat (usec): min=3, max=2776, avg=23.30, stdev= 6.87 00:26:38.141 clat (usec): min=4, max=1162, avg=275.80, stdev=91.27 00:26:38.141 lat (usec): min=13, max=3130, avg=299.10, stdev=94.18 00:26:38.141 clat percentiles (usec): 00:26:38.141 | 50.000th=[ 273], 99.000th=[ 465], 99.900th=[ 506], 99.990th=[ 586], 00:26:38.141 | 99.999th=[ 816] 00:26:38.141 bw ( KiB/s): min=524320, max=779032, per=100.00%, avg=577932.63, stdev=4055.22, samples=266 00:26:38.141 iops : min=131080, max=194755, avg=144483.00, stdev=1013.77, samples=266 00:26:38.141 lat (usec) : 10=0.01%, 
20=0.03%, 50=0.21%, 100=2.00%, 250=46.61% 00:26:38.141 lat (usec) : 500=51.06%, 750=0.08%, 1000=0.01% 00:26:38.141 lat (msec) : 2=0.01% 00:26:38.141 cpu : usr=99.64%, sys=0.00%, ctx=515, majf=0, minf=1051 00:26:38.141 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:38.141 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:38.141 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:38.141 issued rwts: total=0,1440367,1440369,0 short=0,0,0,0 dropped=0,0,0,0 00:26:38.141 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:38.141 00:26:38.141 Run status group 0 (all jobs): 00:26:38.141 WRITE: bw=563MiB/s (590MB/s), 563MiB/s-563MiB/s (590MB/s-590MB/s), io=5626MiB (5900MB), run=10001-10001msec 00:26:38.141 TRIM: bw=563MiB/s (590MB/s), 563MiB/s-563MiB/s (590MB/s-590MB/s), io=5626MiB (5900MB), run=10001-10001msec 00:26:38.141 00:26:38.141 real 0m11.475s 00:26:38.141 user 2m25.461s 00:26:38.141 sys 0m0.636s 00:26:38.141 11:37:20 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:38.141 11:37:20 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:26:38.141 ************************************ 00:26:38.141 END TEST bdev_fio_trim 00:26:38.141 ************************************ 00:26:38.141 11:37:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:26:38.141 11:37:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:38.141 11:37:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:26:38.141 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:38.141 11:37:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:26:38.141 00:26:38.141 real 0m23.239s 00:26:38.141 user 5m10.776s 00:26:38.141 sys 0m2.049s 00:26:38.141 11:37:20 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:26:38.141 11:37:20 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:38.141 ************************************ 00:26:38.141 END TEST bdev_fio 00:26:38.141 ************************************ 00:26:38.141 11:37:20 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:38.141 11:37:20 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:38.141 11:37:20 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:26:38.141 11:37:20 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:38.141 11:37:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:38.141 ************************************ 00:26:38.141 START TEST bdev_verify 00:26:38.141 ************************************ 00:26:38.141 11:37:20 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:38.141 [2024-06-10 11:37:20.594284] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:26:38.141 [2024-06-10 11:37:20.594329] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid120291 ] 00:26:38.141 [2024-06-10 11:37:20.680637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:38.141 [2024-06-10 11:37:20.767396] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:38.141 [2024-06-10 11:37:20.767398] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:38.141 [2024-06-10 11:37:20.908498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:26:38.141 [2024-06-10 11:37:20.908543] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:38.141 [2024-06-10 11:37:20.908552] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:38.141 [2024-06-10 11:37:20.916499] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:38.141 [2024-06-10 11:37:20.916516] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:38.141 [2024-06-10 11:37:20.924517] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:38.141 [2024-06-10 11:37:20.924533] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:38.141 [2024-06-10 11:37:20.994569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:26:38.141 [2024-06-10 11:37:20.994613] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:38.141 [2024-06-10 11:37:20.994625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f40c30 00:26:38.141 [2024-06-10 11:37:20.994634] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:38.141 [2024-06-10 
11:37:20.995697] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:38.141 [2024-06-10 11:37:20.995722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:26:38.141 Running I/O for 5 seconds... 00:26:43.471 00:26:43.471 Latency(us) 00:26:43.471 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:43.471 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x0 length 0x1000 00:26:43.471 Malloc0 : 5.09 1685.40 6.58 0.00 0.00 75831.67 381.11 175066.60 00:26:43.471 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x1000 length 0x1000 00:26:43.471 Malloc0 : 5.07 1667.43 6.51 0.00 0.00 76644.54 430.97 277188.79 00:26:43.471 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x0 length 0x800 00:26:43.471 Malloc1p0 : 5.09 855.02 3.34 0.00 0.00 149098.41 2706.92 164124.94 00:26:43.471 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x800 length 0x800 00:26:43.471 Malloc1p0 : 5.07 858.70 3.35 0.00 0.00 148471.00 2721.17 155006.89 00:26:43.471 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x0 length 0x800 00:26:43.471 Malloc1p1 : 5.09 854.78 3.34 0.00 0.00 148831.09 2664.18 161389.52 00:26:43.471 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x800 length 0x800 00:26:43.471 Malloc1p1 : 5.07 858.43 3.35 0.00 0.00 148206.11 2635.69 152271.47 00:26:43.471 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x0 length 0x200 00:26:43.471 Malloc2p0 : 5.09 854.52 3.34 0.00 0.00 148557.23 2621.44 158654.11 00:26:43.471 
Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x200 length 0x200 00:26:43.471 Malloc2p0 : 5.07 858.15 3.35 0.00 0.00 147930.79 2621.44 149536.06 00:26:43.471 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x0 length 0x200 00:26:43.471 Malloc2p1 : 5.09 854.28 3.34 0.00 0.00 148283.55 2735.42 156830.50 00:26:43.471 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.471 Verification LBA range: start 0x200 length 0x200 00:26:43.471 Malloc2p1 : 5.17 866.69 3.39 0.00 0.00 146181.30 2692.67 146800.64 00:26:43.471 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x200 00:26:43.472 Malloc2p2 : 5.10 854.03 3.34 0.00 0.00 148017.59 2493.22 153183.28 00:26:43.472 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x200 length 0x200 00:26:43.472 Malloc2p2 : 5.17 866.43 3.38 0.00 0.00 145934.68 2535.96 144065.22 00:26:43.472 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x200 00:26:43.472 Malloc2p3 : 5.18 865.54 3.38 0.00 0.00 145785.16 2564.45 150447.86 00:26:43.472 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x200 length 0x200 00:26:43.472 Malloc2p3 : 5.17 866.16 3.38 0.00 0.00 145697.34 2592.95 142241.61 00:26:43.472 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x200 00:26:43.472 Malloc2p4 : 5.18 865.30 3.38 0.00 0.00 145530.17 2592.95 148624.25 00:26:43.472 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x200 length 0x200 00:26:43.472 
Malloc2p4 : 5.17 865.89 3.38 0.00 0.00 145446.67 2578.70 138594.39 00:26:43.472 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x200 00:26:43.472 Malloc2p5 : 5.18 865.07 3.38 0.00 0.00 145279.43 2464.72 146800.64 00:26:43.472 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x200 length 0x200 00:26:43.472 Malloc2p5 : 5.18 865.64 3.38 0.00 0.00 145205.40 2478.97 137682.59 00:26:43.472 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x200 00:26:43.472 Malloc2p6 : 5.18 864.75 3.38 0.00 0.00 145054.88 2436.23 144977.03 00:26:43.472 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x200 length 0x200 00:26:43.472 Malloc2p6 : 5.18 865.41 3.38 0.00 0.00 144973.40 2478.97 135858.98 00:26:43.472 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x200 00:26:43.472 Malloc2p7 : 5.18 864.27 3.38 0.00 0.00 144866.36 2521.71 141329.81 00:26:43.472 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x200 length 0x200 00:26:43.472 Malloc2p7 : 5.18 865.18 3.38 0.00 0.00 144743.06 2521.71 132211.76 00:26:43.472 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x1000 00:26:43.472 TestPT : 5.19 840.24 3.28 0.00 0.00 148139.26 16754.42 141329.81 00:26:43.472 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x1000 length 0x1000 00:26:43.472 TestPT : 5.19 842.69 3.29 0.00 0.00 148012.62 9573.95 189655.49 00:26:43.472 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 
Verification LBA range: start 0x0 length 0x2000 00:26:43.472 raid0 : 5.19 863.67 3.37 0.00 0.00 144188.62 2592.95 122181.90 00:26:43.472 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x2000 length 0x2000 00:26:43.472 raid0 : 5.18 864.72 3.38 0.00 0.00 144028.41 2564.45 113063.85 00:26:43.472 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x2000 00:26:43.472 concat0 : 5.19 863.35 3.37 0.00 0.00 143963.06 2436.23 120358.29 00:26:43.472 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x2000 length 0x2000 00:26:43.472 concat0 : 5.18 864.24 3.38 0.00 0.00 143836.98 2421.98 108960.72 00:26:43.472 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x1000 00:26:43.472 raid1 : 5.19 862.94 3.37 0.00 0.00 143742.18 2906.38 115343.36 00:26:43.472 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x1000 length 0x1000 00:26:43.472 raid1 : 5.19 863.76 3.37 0.00 0.00 143635.55 3077.34 112152.04 00:26:43.472 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x0 length 0x4e2 00:26:43.472 AIO0 : 5.19 862.77 3.37 0.00 0.00 143469.67 1139.76 114431.55 00:26:43.472 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:43.472 Verification LBA range: start 0x4e2 length 0x4e2 00:26:43.472 AIO0 : 5.19 863.53 3.37 0.00 0.00 143370.74 1154.00 116711.07 00:26:43.472 =================================================================================================================== 00:26:43.472 Total : 29178.95 113.98 0.00 0.00 138026.06 381.11 277188.79 00:26:43.472 00:26:43.472 real 0m6.277s 00:26:43.472 user 0m11.770s 00:26:43.472 sys 0m0.336s 
00:26:43.472 11:37:26 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:43.472 11:37:26 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:26:43.472 ************************************ 00:26:43.472 END TEST bdev_verify 00:26:43.472 ************************************ 00:26:43.472 11:37:26 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:43.472 11:37:26 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:26:43.472 11:37:26 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:43.472 11:37:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:43.472 ************************************ 00:26:43.472 START TEST bdev_verify_big_io 00:26:43.472 ************************************ 00:26:43.472 11:37:26 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:43.472 [2024-06-10 11:37:26.959099] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:26:43.472 [2024-06-10 11:37:26.959147] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid121035 ] 00:26:43.472 [2024-06-10 11:37:27.046779] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:43.472 [2024-06-10 11:37:27.127220] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:43.472 [2024-06-10 11:37:27.127222] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:43.472 [2024-06-10 11:37:27.263717] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:26:43.472 [2024-06-10 11:37:27.263768] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:43.472 [2024-06-10 11:37:27.263779] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:43.472 [2024-06-10 11:37:27.271721] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:43.472 [2024-06-10 11:37:27.271739] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:43.472 [2024-06-10 11:37:27.279737] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:43.472 [2024-06-10 11:37:27.279753] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:43.472 [2024-06-10 11:37:27.350697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:26:43.472 [2024-06-10 11:37:27.350741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:43.472 [2024-06-10 11:37:27.350754] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1076c30 00:26:43.472 [2024-06-10 11:37:27.350763] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:43.472 [2024-06-10 
11:37:27.351806] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:43.472 [2024-06-10 11:37:27.351832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:26:43.732 [2024-06-10 11:37:27.505980] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.506802] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.508116] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.508929] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.510277] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.511122] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.512477] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.513827] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.514619] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.515843] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.516586] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.517787] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.518513] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.519721] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.520454] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.521659] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:26:43.732 [2024-06-10 11:37:27.542974] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:26:43.732 [2024-06-10 11:37:27.544714] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:26:43.732 Running I/O for 5 seconds... 
00:26:50.298 00:26:50.298 Latency(us) 00:26:50.298 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:50.298 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x100 00:26:50.298 Malloc0 : 5.33 336.35 21.02 0.00 0.00 375323.72 587.69 1116049.59 00:26:50.298 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x100 length 0x100 00:26:50.298 Malloc0 : 5.54 323.44 20.22 0.00 0.00 391035.25 562.75 1240055.10 00:26:50.298 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x80 00:26:50.298 Malloc1p0 : 6.03 58.34 3.65 0.00 0.00 2056197.47 1082.77 3267909.90 00:26:50.298 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x80 length 0x80 00:26:50.298 Malloc1p0 : 5.71 169.56 10.60 0.00 0.00 719430.10 1937.59 1458888.35 00:26:50.298 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x80 00:26:50.298 Malloc1p1 : 6.03 58.34 3.65 0.00 0.00 2014174.36 1047.15 3165787.71 00:26:50.298 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x80 length 0x80 00:26:50.298 Malloc1p1 : 5.93 59.39 3.71 0.00 0.00 1996473.24 1068.52 3078254.41 00:26:50.298 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x20 00:26:50.298 Malloc2p0 : 5.68 45.06 2.82 0.00 0.00 654348.61 487.96 1203582.89 00:26:50.298 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x20 length 0x20 00:26:50.298 Malloc2p0 : 5.66 45.20 2.83 0.00 0.00 657472.09 495.08 1072282.94 00:26:50.298 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x20 00:26:50.298 Malloc2p1 : 5.75 47.34 2.96 0.00 0.00 624653.48 484.40 1188994.00 00:26:50.298 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x20 length 0x20 00:26:50.298 Malloc2p1 : 5.66 45.20 2.82 0.00 0.00 654421.71 509.33 1057694.05 00:26:50.298 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x20 00:26:50.298 Malloc2p2 : 5.75 47.33 2.96 0.00 0.00 620999.06 484.40 1174405.12 00:26:50.298 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x20 length 0x20 00:26:50.298 Malloc2p2 : 5.66 45.19 2.82 0.00 0.00 651583.26 502.21 1043105.17 00:26:50.298 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x20 00:26:50.298 Malloc2p3 : 5.75 47.32 2.96 0.00 0.00 617793.98 480.83 1167110.68 00:26:50.298 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x20 length 0x20 00:26:50.298 Malloc2p3 : 5.67 45.18 2.82 0.00 0.00 647983.27 498.64 1028516.29 00:26:50.298 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x20 00:26:50.298 Malloc2p4 : 5.75 47.32 2.96 0.00 0.00 614326.83 480.83 1152521.79 00:26:50.298 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x20 length 0x20 00:26:50.298 Malloc2p4 : 5.71 47.63 2.98 0.00 0.00 616559.65 502.21 1013927.40 00:26:50.298 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x0 length 0x20 00:26:50.298 Malloc2p5 : 5.75 47.31 2.96 0.00 0.00 610820.51 495.08 
1137932.91 00:26:50.298 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.298 Verification LBA range: start 0x20 length 0x20 00:26:50.299 Malloc2p5 : 5.71 47.62 2.98 0.00 0.00 613743.83 495.08 1006632.96 00:26:50.299 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x20 00:26:50.299 Malloc2p6 : 5.75 47.30 2.96 0.00 0.00 607290.20 473.71 1116049.59 00:26:50.299 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x20 length 0x20 00:26:50.299 Malloc2p6 : 5.71 47.62 2.98 0.00 0.00 610496.64 498.64 992044.08 00:26:50.299 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x20 00:26:50.299 Malloc2p7 : 5.75 47.29 2.96 0.00 0.00 604408.37 512.89 1108755.14 00:26:50.299 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x20 length 0x20 00:26:50.299 Malloc2p7 : 5.71 47.61 2.98 0.00 0.00 607514.88 502.21 977455.19 00:26:50.299 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x100 00:26:50.299 TestPT : 6.11 60.26 3.77 0.00 0.00 1831588.60 1089.89 2946954.46 00:26:50.299 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x100 length 0x100 00:26:50.299 TestPT : 5.96 56.73 3.55 0.00 0.00 1978903.27 59267.34 2655176.79 00:26:50.299 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x200 00:26:50.299 raid0 : 6.04 66.27 4.14 0.00 0.00 1660492.23 1111.26 2844832.28 00:26:50.299 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x200 length 0x200 00:26:50.299 raid0 : 5.87 
65.38 4.09 0.00 0.00 1698235.67 1132.63 2786476.74 00:26:50.299 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x200 00:26:50.299 concat0 : 5.96 75.82 4.74 0.00 0.00 1428860.04 1104.14 2742710.09 00:26:50.299 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x200 length 0x200 00:26:50.299 concat0 : 5.93 70.17 4.39 0.00 0.00 1555859.38 1089.89 2684354.56 00:26:50.299 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x100 00:26:50.299 raid1 : 6.04 98.05 6.13 0.00 0.00 1088359.87 1474.56 2640587.91 00:26:50.299 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x100 length 0x100 00:26:50.299 raid1 : 5.96 80.53 5.03 0.00 0.00 1344123.24 1467.44 2596821.26 00:26:50.299 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x0 length 0x4e 00:26:50.299 AIO0 : 6.07 98.79 6.17 0.00 0.00 647345.66 619.74 1619366.07 00:26:50.299 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:26:50.299 Verification LBA range: start 0x4e length 0x4e 00:26:50.299 AIO0 : 5.99 93.92 5.87 0.00 0.00 692214.36 598.37 1546421.65 00:26:50.299 =================================================================================================================== 00:26:50.299 Total : 2518.86 157.43 0.00 0.00 891095.78 473.71 3267909.90 00:26:50.299 00:26:50.299 real 0m7.214s 00:26:50.299 user 0m13.603s 00:26:50.299 sys 0m0.373s 00:26:50.299 11:37:34 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:50.299 11:37:34 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:26:50.299 ************************************ 00:26:50.299 END TEST bdev_verify_big_io 
00:26:50.299 ************************************ 00:26:50.299 11:37:34 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:50.299 11:37:34 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:26:50.299 11:37:34 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:50.299 11:37:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:50.299 ************************************ 00:26:50.299 START TEST bdev_write_zeroes 00:26:50.299 ************************************ 00:26:50.299 11:37:34 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:50.558 [2024-06-10 11:37:34.260116] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:26:50.558 [2024-06-10 11:37:34.260161] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid122087 ] 00:26:50.558 [2024-06-10 11:37:34.344555] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:50.558 [2024-06-10 11:37:34.429713] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:50.817 [2024-06-10 11:37:34.576407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:26:50.817 [2024-06-10 11:37:34.576457] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:50.817 [2024-06-10 11:37:34.576467] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:50.817 [2024-06-10 11:37:34.584398] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:50.817 [2024-06-10 11:37:34.584417] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:50.817 [2024-06-10 11:37:34.592409] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:50.817 [2024-06-10 11:37:34.592426] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:50.817 [2024-06-10 11:37:34.664981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:26:50.817 [2024-06-10 11:37:34.665029] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.817 [2024-06-10 11:37:34.665043] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cc5a90 00:26:50.818 [2024-06-10 11:37:34.665051] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.818 [2024-06-10 11:37:34.666031] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.818 
[2024-06-10 11:37:34.666055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:26:51.076 Running I/O for 1 seconds... 00:26:52.017 00:26:52.017 Latency(us) 00:26:52.017 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:52.017 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.017 Malloc0 : 1.03 7693.37 30.05 0.00 0.00 16636.54 441.66 27582.11 00:26:52.017 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.017 Malloc1p0 : 1.03 7686.43 30.03 0.00 0.00 16631.93 601.93 27012.23 00:26:52.018 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc1p1 : 1.03 7679.48 30.00 0.00 0.00 16620.91 594.81 26442.35 00:26:52.018 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p0 : 1.03 7672.56 29.97 0.00 0.00 16612.22 601.93 25872.47 00:26:52.018 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p1 : 1.04 7665.72 29.94 0.00 0.00 16605.31 594.81 25302.59 00:26:52.018 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p2 : 1.04 7658.88 29.92 0.00 0.00 16594.30 598.37 24732.72 00:26:52.018 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p3 : 1.04 7652.02 29.89 0.00 0.00 16587.95 598.37 24048.86 00:26:52.018 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p4 : 1.04 7645.17 29.86 0.00 0.00 16577.02 598.37 23478.98 00:26:52.018 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p5 : 1.04 7638.40 29.84 0.00 0.00 16566.25 598.37 22909.11 00:26:52.018 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p6 : 1.04 7631.53 29.81 0.00 0.00 16555.52 
598.37 22339.23 00:26:52.018 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 Malloc2p7 : 1.04 7624.71 29.78 0.00 0.00 16550.46 594.81 21769.35 00:26:52.018 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 TestPT : 1.04 7617.97 29.76 0.00 0.00 16540.43 619.74 21199.47 00:26:52.018 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 raid0 : 1.04 7610.17 29.73 0.00 0.00 16530.83 1032.90 20173.69 00:26:52.018 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 concat0 : 1.04 7602.51 29.70 0.00 0.00 16508.47 1032.90 19147.91 00:26:52.018 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 raid1 : 1.05 7592.97 29.66 0.00 0.00 16482.26 1645.52 17666.23 00:26:52.018 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:52.018 AIO0 : 1.05 7587.14 29.64 0.00 0.00 16439.95 690.98 17552.25 00:26:52.018 =================================================================================================================== 00:26:52.018 Total : 122259.03 477.57 0.00 0.00 16565.02 441.66 27582.11 00:26:52.276 00:26:52.277 real 0m2.015s 00:26:52.277 user 0m1.653s 00:26:52.277 sys 0m0.307s 00:26:52.277 11:37:36 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:52.277 11:37:36 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:26:52.277 ************************************ 00:26:52.277 END TEST bdev_write_zeroes 00:26:52.277 ************************************ 00:26:52.536 11:37:36 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:52.536 11:37:36 
blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:26:52.536 11:37:36 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:52.536 11:37:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:52.536 ************************************ 00:26:52.536 START TEST bdev_json_nonenclosed 00:26:52.536 ************************************ 00:26:52.536 11:37:36 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:52.536 [2024-06-10 11:37:36.362516] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:26:52.536 [2024-06-10 11:37:36.362565] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid122295 ] 00:26:52.536 [2024-06-10 11:37:36.454604] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:52.795 [2024-06-10 11:37:36.539437] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.795 [2024-06-10 11:37:36.539506] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:26:52.795 [2024-06-10 11:37:36.539523] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:52.795 [2024-06-10 11:37:36.539532] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:52.795 00:26:52.795 real 0m0.321s 00:26:52.795 user 0m0.193s 00:26:52.795 sys 0m0.125s 00:26:52.795 11:37:36 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:52.795 11:37:36 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:26:52.795 ************************************ 00:26:52.795 END TEST bdev_json_nonenclosed 00:26:52.795 ************************************ 00:26:52.795 11:37:36 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:52.795 11:37:36 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:26:52.795 11:37:36 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:52.795 11:37:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:52.795 ************************************ 00:26:52.795 START TEST bdev_json_nonarray 00:26:52.795 ************************************ 00:26:52.795 11:37:36 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:53.055 [2024-06-10 11:37:36.767334] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:26:53.055 [2024-06-10 11:37:36.767379] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid122483 ] 00:26:53.055 [2024-06-10 11:37:36.851035] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:53.055 [2024-06-10 11:37:36.932198] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:53.055 [2024-06-10 11:37:36.932263] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:26:53.055 [2024-06-10 11:37:36.932277] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:53.055 [2024-06-10 11:37:36.932285] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:53.314 00:26:53.314 real 0m0.302s 00:26:53.314 user 0m0.186s 00:26:53.314 sys 0m0.114s 00:26:53.314 11:37:37 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:53.314 11:37:37 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:26:53.314 ************************************ 00:26:53.314 END TEST bdev_json_nonarray 00:26:53.314 ************************************ 00:26:53.314 11:37:37 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:26:53.314 11:37:37 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:26:53.314 11:37:37 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:53.314 11:37:37 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:53.314 11:37:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:26:53.314 ************************************ 00:26:53.314 START TEST bdev_qos 00:26:53.314 ************************************ 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 
-- # qos_test_suite '' 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=122506 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 122506' 00:26:53.314 Process qos testing pid: 122506 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 122506 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@830 -- # '[' -z 122506 ']' 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:53.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:53.314 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:53.314 [2024-06-10 11:37:37.151041] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:26:53.314 [2024-06-10 11:37:37.151097] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid122506 ] 00:26:53.314 [2024-06-10 11:37:37.237799] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:53.573 [2024-06-10 11:37:37.322844] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@863 -- # return 0 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:54.142 Malloc_0 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_0 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- 
common/autotest_common.sh@10 -- # set +x 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:54.142 11:37:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:54.142 [ 00:26:54.142 { 00:26:54.142 "name": "Malloc_0", 00:26:54.142 "aliases": [ 00:26:54.142 "e48f2fe4-1e35-49d5-ae78-9a74de5501ae" 00:26:54.142 ], 00:26:54.142 "product_name": "Malloc disk", 00:26:54.142 "block_size": 512, 00:26:54.142 "num_blocks": 262144, 00:26:54.142 "uuid": "e48f2fe4-1e35-49d5-ae78-9a74de5501ae", 00:26:54.142 "assigned_rate_limits": { 00:26:54.142 "rw_ios_per_sec": 0, 00:26:54.142 "rw_mbytes_per_sec": 0, 00:26:54.143 "r_mbytes_per_sec": 0, 00:26:54.143 "w_mbytes_per_sec": 0 00:26:54.143 }, 00:26:54.143 "claimed": false, 00:26:54.143 "zoned": false, 00:26:54.143 "supported_io_types": { 00:26:54.143 "read": true, 00:26:54.143 "write": true, 00:26:54.143 "unmap": true, 00:26:54.143 "write_zeroes": true, 00:26:54.143 "flush": true, 00:26:54.143 "reset": true, 00:26:54.143 "compare": false, 00:26:54.143 "compare_and_write": false, 00:26:54.143 "abort": true, 00:26:54.143 "nvme_admin": false, 00:26:54.143 "nvme_io": false 00:26:54.143 }, 00:26:54.143 "memory_domains": [ 00:26:54.143 { 00:26:54.143 "dma_device_id": "system", 00:26:54.143 "dma_device_type": 1 00:26:54.143 }, 00:26:54.143 { 00:26:54.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.143 "dma_device_type": 2 00:26:54.143 } 00:26:54.143 ], 00:26:54.143 "driver_specific": {} 00:26:54.143 } 00:26:54.143 ] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:26:54.143 11:37:38 
blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:54.143 Null_1 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Null_1 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:54.143 [ 00:26:54.143 { 00:26:54.143 "name": "Null_1", 00:26:54.143 "aliases": [ 00:26:54.143 "33673813-1287-401e-b948-79ef5357ea06" 00:26:54.143 ], 00:26:54.143 "product_name": "Null disk", 00:26:54.143 "block_size": 512, 00:26:54.143 "num_blocks": 
262144, 00:26:54.143 "uuid": "33673813-1287-401e-b948-79ef5357ea06", 00:26:54.143 "assigned_rate_limits": { 00:26:54.143 "rw_ios_per_sec": 0, 00:26:54.143 "rw_mbytes_per_sec": 0, 00:26:54.143 "r_mbytes_per_sec": 0, 00:26:54.143 "w_mbytes_per_sec": 0 00:26:54.143 }, 00:26:54.143 "claimed": false, 00:26:54.143 "zoned": false, 00:26:54.143 "supported_io_types": { 00:26:54.143 "read": true, 00:26:54.143 "write": true, 00:26:54.143 "unmap": false, 00:26:54.143 "write_zeroes": true, 00:26:54.143 "flush": false, 00:26:54.143 "reset": true, 00:26:54.143 "compare": false, 00:26:54.143 "compare_and_write": false, 00:26:54.143 "abort": true, 00:26:54.143 "nvme_admin": false, 00:26:54.143 "nvme_io": false 00:26:54.143 }, 00:26:54.143 "driver_specific": {} 00:26:54.143 } 00:26:54.143 ] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:26:54.143 11:37:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:26:54.402 Running I/O for 60 seconds... 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 99317.26 397269.05 0.00 0.00 399360.00 0.00 0.00 ' 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=99317.26 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 99317 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=99317 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=24000 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 24000 -gt 1000 ']' 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 24000 Malloc_0 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 24000 IOPS Malloc_0 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 
']' 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:59.676 11:37:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:26:59.676 ************************************ 00:26:59.676 START TEST bdev_qos_iops 00:26:59.676 ************************************ 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # run_qos_test 24000 IOPS Malloc_0 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=24000 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:26:59.676 11:37:43 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 23944.24 95776.94 0.00 0.00 97248.00 0.00 0.00 ' 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # 
iostat_result=23944.24 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 23944 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=23944 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=21600 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=26400 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 23944 -lt 21600 ']' 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 23944 -gt 26400 ']' 00:27:04.948 00:27:04.948 real 0m5.193s 00:27:04.948 user 0m0.085s 00:27:04.948 sys 0m0.044s 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:04.948 11:37:48 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:27:04.948 ************************************ 00:27:04.948 END TEST bdev_qos_iops 00:27:04.948 ************************************ 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:27:04.948 11:37:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:27:10.215 
11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 30306.15 121224.60 0.00 0.00 122880.00 0.00 0.00 ' 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=122880.00 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 122880 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=122880 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=12 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 12 -lt 2 ']' 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 12 Null_1 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 12 BANDWIDTH Null_1 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:10.215 11:37:53 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:10.215 ************************************ 00:27:10.215 START TEST bdev_qos_bw 00:27:10.215 ************************************ 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # 
run_qos_test 12 BANDWIDTH Null_1 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=12 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:27:10.215 11:37:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 3070.25 12281.01 0.00 0.00 12460.00 0.00 0.00 ' 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=12460.00 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 12460 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=12460 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = 
BANDWIDTH ']' 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=12288 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=11059 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=13516 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 12460 -lt 11059 ']' 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 12460 -gt 13516 ']' 00:27:15.481 00:27:15.481 real 0m5.208s 00:27:15.481 user 0m0.087s 00:27:15.481 sys 0m0.041s 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:15.481 11:37:58 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:27:15.481 ************************************ 00:27:15.481 END TEST bdev_qos_bw 00:27:15.481 ************************************ 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:15.481 11:37:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:15.481 ************************************ 00:27:15.481 START TEST bdev_qos_ro_bw 00:27:15.481 
************************************ 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:27:15.481 11:37:59 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:27:20.754 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.34 2045.38 0.00 0.00 2056.00 0.00 0.00 ' 00:27:20.754 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:27:20.754 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:27:20.754 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:27:20.754 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2056.00 00:27:20.754 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2056 00:27:20.755 11:38:04 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2056 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2056 -lt 1843 ']' 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2056 -gt 2252 ']' 00:27:20.755 00:27:20.755 real 0m5.156s 00:27:20.755 user 0m0.093s 00:27:20.755 sys 0m0.041s 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:20.755 11:38:04 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:27:20.755 ************************************ 00:27:20.755 END TEST bdev_qos_ro_bw 00:27:20.755 ************************************ 00:27:20.755 11:38:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:27:20.755 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:20.755 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:21.014 00:27:21.014 Latency(us) 00:27:21.014 Device Information : 
runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:21.014 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:27:21.014 Malloc_0 : 26.60 33253.87 129.90 0.00 0.00 7623.47 1367.71 503316.48 00:27:21.014 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:27:21.014 Null_1 : 26.71 31857.63 124.44 0.00 0.00 8020.92 516.45 105313.50 00:27:21.014 =================================================================================================================== 00:27:21.014 Total : 65111.50 254.34 0.00 0.00 7818.33 516.45 503316.48 00:27:21.014 0 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 122506 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@949 -- # '[' -z 122506 ']' 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # kill -0 122506 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # uname 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:21.014 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 122506 00:27:21.273 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:27:21.273 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:27:21.273 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # echo 'killing process with pid 122506' 00:27:21.273 killing process with pid 122506 00:27:21.274 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # kill 122506 00:27:21.274 Received shutdown signal, test time was about 26.768278 seconds 00:27:21.274 00:27:21.274 Latency(us) 00:27:21.274 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s 
Average min max 00:27:21.274 =================================================================================================================== 00:27:21.274 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:21.274 11:38:04 blockdev_general.bdev_qos -- common/autotest_common.sh@973 -- # wait 122506 00:27:21.274 11:38:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:27:21.274 00:27:21.274 real 0m28.071s 00:27:21.274 user 0m28.641s 00:27:21.274 sys 0m0.782s 00:27:21.274 11:38:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:21.274 11:38:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:27:21.274 ************************************ 00:27:21.274 END TEST bdev_qos 00:27:21.274 ************************************ 00:27:21.274 11:38:05 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:27:21.274 11:38:05 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:21.274 11:38:05 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:21.274 11:38:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:27:21.534 ************************************ 00:27:21.534 START TEST bdev_qd_sampling 00:27:21.534 ************************************ 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # qd_sampling_test_suite '' 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=126298 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling 
period testing pid: 126298' 00:27:21.534 Process bdev QD sampling period testing pid: 126298 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 126298 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@830 -- # '[' -z 126298 ']' 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:21.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:21.534 11:38:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:21.534 [2024-06-10 11:38:05.274967] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:27:21.534 [2024-06-10 11:38:05.275014] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid126298 ] 00:27:21.534 [2024-06-10 11:38:05.362333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:21.534 [2024-06-10 11:38:05.451544] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:21.534 [2024-06-10 11:38:05.451546] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@863 -- # return 0 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:22.473 Malloc_QD 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_QD 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local i 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:22.473 [ 00:27:22.473 { 00:27:22.473 "name": "Malloc_QD", 00:27:22.473 "aliases": [ 00:27:22.473 "7e3570b7-8942-467b-808a-fb05dc8f3020" 00:27:22.473 ], 00:27:22.473 "product_name": "Malloc disk", 00:27:22.473 "block_size": 512, 00:27:22.473 "num_blocks": 262144, 00:27:22.473 "uuid": "7e3570b7-8942-467b-808a-fb05dc8f3020", 00:27:22.473 "assigned_rate_limits": { 00:27:22.473 "rw_ios_per_sec": 0, 00:27:22.473 "rw_mbytes_per_sec": 0, 00:27:22.473 "r_mbytes_per_sec": 0, 00:27:22.473 "w_mbytes_per_sec": 0 00:27:22.473 }, 00:27:22.473 "claimed": false, 00:27:22.473 "zoned": false, 00:27:22.473 "supported_io_types": { 00:27:22.473 "read": true, 00:27:22.473 "write": true, 00:27:22.473 "unmap": true, 00:27:22.473 "write_zeroes": true, 00:27:22.473 "flush": true, 00:27:22.473 "reset": true, 00:27:22.473 "compare": false, 00:27:22.473 "compare_and_write": false, 00:27:22.473 "abort": true, 00:27:22.473 "nvme_admin": false, 00:27:22.473 "nvme_io": false 00:27:22.473 }, 00:27:22.473 "memory_domains": [ 00:27:22.473 { 00:27:22.473 "dma_device_id": "system", 00:27:22.473 "dma_device_type": 1 00:27:22.473 }, 00:27:22.473 { 00:27:22.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.473 "dma_device_type": 2 00:27:22.473 } 00:27:22.473 ], 00:27:22.473 
"driver_specific": {} 00:27:22.473 } 00:27:22.473 ] 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # return 0 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:27:22.473 11:38:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:22.473 Running I/O for 5 seconds... 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # 
iostats='{ 00:27:24.442 "tick_rate": 2300000000, 00:27:24.442 "ticks": 4184314162627830, 00:27:24.442 "bdevs": [ 00:27:24.442 { 00:27:24.442 "name": "Malloc_QD", 00:27:24.442 "bytes_read": 1015067136, 00:27:24.442 "num_read_ops": 247812, 00:27:24.442 "bytes_written": 0, 00:27:24.442 "num_write_ops": 0, 00:27:24.442 "bytes_unmapped": 0, 00:27:24.442 "num_unmap_ops": 0, 00:27:24.442 "bytes_copied": 0, 00:27:24.442 "num_copy_ops": 0, 00:27:24.442 "read_latency_ticks": 2281936622204, 00:27:24.442 "max_read_latency_ticks": 11437884, 00:27:24.442 "min_read_latency_ticks": 201614, 00:27:24.442 "write_latency_ticks": 0, 00:27:24.442 "max_write_latency_ticks": 0, 00:27:24.442 "min_write_latency_ticks": 0, 00:27:24.442 "unmap_latency_ticks": 0, 00:27:24.442 "max_unmap_latency_ticks": 0, 00:27:24.442 "min_unmap_latency_ticks": 0, 00:27:24.442 "copy_latency_ticks": 0, 00:27:24.442 "max_copy_latency_ticks": 0, 00:27:24.442 "min_copy_latency_ticks": 0, 00:27:24.442 "io_error": {}, 00:27:24.442 "queue_depth_polling_period": 10, 00:27:24.442 "queue_depth": 512, 00:27:24.442 "io_time": 30, 00:27:24.442 "weighted_io_time": 15360 00:27:24.442 } 00:27:24.442 ] 00:27:24.442 }' 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:24.442 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:24.442 00:27:24.442 Latency(us) 
00:27:24.443 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:24.443 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:27:24.443 Malloc_QD : 2.01 62676.02 244.83 0.00 0.00 4075.73 1061.40 4530.53 00:27:24.443 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:27:24.443 Malloc_QD : 2.02 65042.43 254.07 0.00 0.00 3927.99 658.92 4986.43 00:27:24.443 =================================================================================================================== 00:27:24.443 Total : 127718.44 498.90 0.00 0.00 4000.46 658.92 4986.43 00:27:24.443 0 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 126298 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@949 -- # '[' -z 126298 ']' 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # kill -0 126298 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # uname 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 126298 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # echo 'killing process with pid 126298' 00:27:24.443 killing process with pid 126298 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # kill 126298 00:27:24.443 Received shutdown signal, test time was about 
2.091678 seconds 00:27:24.443 00:27:24.443 Latency(us) 00:27:24.443 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:24.443 =================================================================================================================== 00:27:24.443 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:24.443 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@973 -- # wait 126298 00:27:24.702 11:38:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:27:24.702 00:27:24.702 real 0m3.287s 00:27:24.702 user 0m6.448s 00:27:24.702 sys 0m0.378s 00:27:24.702 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:24.702 11:38:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:27:24.703 ************************************ 00:27:24.703 END TEST bdev_qd_sampling 00:27:24.703 ************************************ 00:27:24.703 11:38:08 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:27:24.703 11:38:08 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:24.703 11:38:08 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:24.703 11:38:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:27:24.703 ************************************ 00:27:24.703 START TEST bdev_error 00:27:24.703 ************************************ 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # error_test_suite '' 00:27:24.703 11:38:08 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:27:24.703 11:38:08 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:27:24.703 11:38:08 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:27:24.703 11:38:08 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=126693 00:27:24.703 11:38:08 
blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 126693' 00:27:24.703 Process error testing pid: 126693 00:27:24.703 11:38:08 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:27:24.703 11:38:08 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 126693 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 126693 ']' 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:24.703 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:24.703 11:38:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:24.961 [2024-06-10 11:38:08.655347] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:27:24.961 [2024-06-10 11:38:08.655400] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid126693 ] 00:27:24.961 [2024-06-10 11:38:08.742390] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:24.961 [2024-06-10 11:38:08.828118] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:25.529 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:25.529 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:27:25.529 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:27:25.529 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.529 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 Dev_1 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.789 11:38:09 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 [ 00:27:25.789 { 00:27:25.789 "name": "Dev_1", 00:27:25.789 "aliases": [ 00:27:25.789 "d4be4d59-02a2-45c4-b204-3a019110b105" 00:27:25.789 ], 00:27:25.789 "product_name": "Malloc disk", 00:27:25.789 "block_size": 512, 00:27:25.789 "num_blocks": 262144, 00:27:25.789 "uuid": "d4be4d59-02a2-45c4-b204-3a019110b105", 00:27:25.789 "assigned_rate_limits": { 00:27:25.789 "rw_ios_per_sec": 0, 00:27:25.789 "rw_mbytes_per_sec": 0, 00:27:25.789 "r_mbytes_per_sec": 0, 00:27:25.789 "w_mbytes_per_sec": 0 00:27:25.789 }, 00:27:25.789 "claimed": false, 00:27:25.789 "zoned": false, 00:27:25.789 "supported_io_types": { 00:27:25.789 "read": true, 00:27:25.789 "write": true, 00:27:25.789 "unmap": true, 00:27:25.789 "write_zeroes": true, 00:27:25.789 "flush": true, 00:27:25.789 "reset": true, 00:27:25.789 "compare": false, 00:27:25.789 "compare_and_write": false, 00:27:25.789 "abort": true, 00:27:25.789 "nvme_admin": false, 00:27:25.789 "nvme_io": false 00:27:25.789 }, 00:27:25.789 "memory_domains": [ 00:27:25.789 { 00:27:25.789 "dma_device_id": "system", 00:27:25.789 "dma_device_type": 1 00:27:25.789 }, 00:27:25.789 { 00:27:25.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:25.789 "dma_device_type": 2 00:27:25.789 } 00:27:25.789 ], 00:27:25.789 "driver_specific": {} 00:27:25.789 } 00:27:25.789 ] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # 
return 0 00:27:25.789 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 true 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 Dev_2 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.789 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.789 [ 00:27:25.789 { 00:27:25.789 "name": "Dev_2", 00:27:25.789 "aliases": [ 00:27:25.790 "67edd54b-860a-4d2e-90e6-c0ad25d05843" 00:27:25.790 ], 00:27:25.790 "product_name": "Malloc disk", 00:27:25.790 "block_size": 512, 00:27:25.790 "num_blocks": 262144, 00:27:25.790 "uuid": "67edd54b-860a-4d2e-90e6-c0ad25d05843", 00:27:25.790 "assigned_rate_limits": { 00:27:25.790 "rw_ios_per_sec": 0, 00:27:25.790 "rw_mbytes_per_sec": 0, 00:27:25.790 "r_mbytes_per_sec": 0, 00:27:25.790 "w_mbytes_per_sec": 0 00:27:25.790 }, 00:27:25.790 "claimed": false, 00:27:25.790 "zoned": false, 00:27:25.790 "supported_io_types": { 00:27:25.790 "read": true, 00:27:25.790 "write": true, 00:27:25.790 "unmap": true, 00:27:25.790 "write_zeroes": true, 00:27:25.790 "flush": true, 00:27:25.790 "reset": true, 00:27:25.790 "compare": false, 00:27:25.790 "compare_and_write": false, 00:27:25.790 "abort": true, 00:27:25.790 "nvme_admin": false, 00:27:25.790 "nvme_io": false 00:27:25.790 }, 00:27:25.790 "memory_domains": [ 00:27:25.790 { 00:27:25.790 "dma_device_id": "system", 00:27:25.790 "dma_device_type": 1 00:27:25.790 }, 00:27:25.790 { 00:27:25.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:25.790 "dma_device_type": 2 00:27:25.790 } 00:27:25.790 ], 00:27:25.790 "driver_specific": {} 00:27:25.790 } 00:27:25.790 ] 00:27:25.790 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.790 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:27:25.790 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:27:25.790 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:25.790 
11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:25.790 11:38:09 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:25.790 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:27:25.790 11:38:09 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:27:25.790 Running I/O for 5 seconds... 00:27:26.727 11:38:10 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 126693 00:27:26.727 11:38:10 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 126693' 00:27:26.727 Process is existed as continue on error is set. Pid: 126693 00:27:26.727 11:38:10 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:27:26.727 11:38:10 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:26.727 11:38:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:26.727 11:38:10 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:26.727 11:38:10 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:27:26.727 11:38:10 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:26.727 11:38:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:26.727 11:38:10 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:26.727 11:38:10 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:27:26.986 Timeout while waiting for response: 00:27:26.986 00:27:26.986 00:27:31.186 00:27:31.186 Latency(us) 00:27:31.186 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.186 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 
00:27:31.186 EE_Dev_1 : 0.93 57708.04 225.42 5.39 0.00 274.98 90.82 673.17 00:27:31.186 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:27:31.186 Dev_2 : 5.00 128324.63 501.27 0.00 0.00 122.49 40.96 21997.30 00:27:31.186 =================================================================================================================== 00:27:31.186 Total : 186032.67 726.69 5.39 0.00 134.24 40.96 21997.30 00:27:31.754 11:38:15 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 126693 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@949 -- # '[' -z 126693 ']' 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # kill -0 126693 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # uname 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 126693 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # echo 'killing process with pid 126693' 00:27:31.755 killing process with pid 126693 00:27:31.755 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # kill 126693 00:27:31.755 Received shutdown signal, test time was about 5.000000 seconds 00:27:31.755 00:27:31.755 Latency(us) 00:27:31.755 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:31.755 =================================================================================================================== 00:27:31.755 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:31.755 11:38:15 blockdev_general.bdev_error -- 
common/autotest_common.sh@973 -- # wait 126693 00:27:32.014 11:38:15 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=127744 00:27:32.014 11:38:15 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 127744' 00:27:32.014 Process error testing pid: 127744 00:27:32.014 11:38:15 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:27:32.014 11:38:15 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 127744 00:27:32.014 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 127744 ']' 00:27:32.014 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:32.014 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:32.014 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:32.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:32.014 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:32.014 11:38:15 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:32.273 [2024-06-10 11:38:15.984369] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:27:32.273 [2024-06-10 11:38:15.984428] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid127744 ] 00:27:32.273 [2024-06-10 11:38:16.070612] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:32.273 [2024-06-10 11:38:16.156384] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:32.841 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:32.841 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:27:32.841 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:27:32.841 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:32.842 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.101 Dev_1 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.101 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.101 11:38:16 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.101 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.101 [ 00:27:33.101 { 00:27:33.101 "name": "Dev_1", 00:27:33.101 "aliases": [ 00:27:33.101 "4e999582-05f7-4c12-840b-c5e87ede34dc" 00:27:33.101 ], 00:27:33.101 "product_name": "Malloc disk", 00:27:33.101 "block_size": 512, 00:27:33.101 "num_blocks": 262144, 00:27:33.101 "uuid": "4e999582-05f7-4c12-840b-c5e87ede34dc", 00:27:33.101 "assigned_rate_limits": { 00:27:33.101 "rw_ios_per_sec": 0, 00:27:33.101 "rw_mbytes_per_sec": 0, 00:27:33.101 "r_mbytes_per_sec": 0, 00:27:33.101 "w_mbytes_per_sec": 0 00:27:33.101 }, 00:27:33.101 "claimed": false, 00:27:33.101 "zoned": false, 00:27:33.101 "supported_io_types": { 00:27:33.101 "read": true, 00:27:33.101 "write": true, 00:27:33.101 "unmap": true, 00:27:33.101 "write_zeroes": true, 00:27:33.101 "flush": true, 00:27:33.101 "reset": true, 00:27:33.101 "compare": false, 00:27:33.101 "compare_and_write": false, 00:27:33.101 "abort": true, 00:27:33.101 "nvme_admin": false, 00:27:33.101 "nvme_io": false 00:27:33.101 }, 00:27:33.102 "memory_domains": [ 00:27:33.102 { 00:27:33.102 "dma_device_id": "system", 00:27:33.102 "dma_device_type": 1 00:27:33.102 }, 00:27:33.102 { 00:27:33.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:33.102 "dma_device_type": 2 00:27:33.102 } 00:27:33.102 ], 00:27:33.102 "driver_specific": {} 00:27:33.102 } 00:27:33.102 ] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # 
return 0 00:27:33.102 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.102 true 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.102 Dev_2 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.102 [ 00:27:33.102 { 00:27:33.102 "name": "Dev_2", 00:27:33.102 "aliases": [ 00:27:33.102 "6a68ef61-454d-4780-a008-aafb91718943" 00:27:33.102 ], 00:27:33.102 "product_name": "Malloc disk", 00:27:33.102 "block_size": 512, 00:27:33.102 "num_blocks": 262144, 00:27:33.102 "uuid": "6a68ef61-454d-4780-a008-aafb91718943", 00:27:33.102 "assigned_rate_limits": { 00:27:33.102 "rw_ios_per_sec": 0, 00:27:33.102 "rw_mbytes_per_sec": 0, 00:27:33.102 "r_mbytes_per_sec": 0, 00:27:33.102 "w_mbytes_per_sec": 0 00:27:33.102 }, 00:27:33.102 "claimed": false, 00:27:33.102 "zoned": false, 00:27:33.102 "supported_io_types": { 00:27:33.102 "read": true, 00:27:33.102 "write": true, 00:27:33.102 "unmap": true, 00:27:33.102 "write_zeroes": true, 00:27:33.102 "flush": true, 00:27:33.102 "reset": true, 00:27:33.102 "compare": false, 00:27:33.102 "compare_and_write": false, 00:27:33.102 "abort": true, 00:27:33.102 "nvme_admin": false, 00:27:33.102 "nvme_io": false 00:27:33.102 }, 00:27:33.102 "memory_domains": [ 00:27:33.102 { 00:27:33.102 "dma_device_id": "system", 00:27:33.102 "dma_device_type": 1 00:27:33.102 }, 00:27:33.102 { 00:27:33.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:33.102 "dma_device_type": 2 00:27:33.102 } 00:27:33.102 ], 00:27:33.102 "driver_specific": {} 00:27:33.102 } 00:27:33.102 ] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:27:33.102 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:33.102 
11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:33.102 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 127744 00:27:33.102 11:38:16 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@649 -- # local es=0 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # valid_exec_arg wait 127744 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@637 -- # local arg=wait 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # type -t wait 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:33.102 11:38:16 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # wait 127744 00:27:33.102 Running I/O for 5 seconds... 
00:27:33.102 task offset: 87440 on job bdev=EE_Dev_1 fails 00:27:33.102 00:27:33.102 Latency(us) 00:27:33.102 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:33.102 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:27:33.102 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:27:33.102 EE_Dev_1 : 0.00 44715.45 174.67 10162.60 0.00 240.66 92.61 430.97 00:27:33.102 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:27:33.102 Dev_2 : 0.00 27562.45 107.67 0.00 0.00 429.18 85.04 797.83 00:27:33.102 =================================================================================================================== 00:27:33.102 Total : 72277.89 282.34 10162.60 0.00 342.91 85.04 797.83 00:27:33.102 [2024-06-10 11:38:17.014627] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:33.102 request: 00:27:33.102 { 00:27:33.102 "method": "perform_tests", 00:27:33.102 "req_id": 1 00:27:33.102 } 00:27:33.102 Got JSON-RPC error response 00:27:33.102 response: 00:27:33.102 { 00:27:33.102 "code": -32603, 00:27:33.102 "message": "bdevperf failed with error Operation not permitted" 00:27:33.102 } 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # es=255 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # es=127 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # case "$es" in 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@669 -- # es=1 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:27:33.378 00:27:33.378 real 0m8.675s 00:27:33.378 user 0m8.852s 00:27:33.378 sys 0m0.746s 00:27:33.378 11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:33.378 
11:38:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:27:33.378 ************************************ 00:27:33.378 END TEST bdev_error 00:27:33.378 ************************************ 00:27:33.378 11:38:17 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:27:33.378 11:38:17 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:33.378 11:38:17 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:33.378 11:38:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:27:33.637 ************************************ 00:27:33.637 START TEST bdev_stat 00:27:33.637 ************************************ 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # stat_test_suite '' 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=127948 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 127948' 00:27:33.637 Process Bdev IO statistics testing pid: 127948 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 127948 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@830 -- # '[' -z 127948 ']' 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local max_retries=100 
00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:33.637 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:33.637 11:38:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:33.637 [2024-06-10 11:38:17.407728] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:27:33.637 [2024-06-10 11:38:17.407781] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid127948 ] 00:27:33.637 [2024-06-10 11:38:17.495187] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:33.637 [2024-06-10 11:38:17.580064] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:27:33.637 [2024-06-10 11:38:17.580067] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@863 -- # return 0 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:34.575 Malloc_STAT 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- 
common/autotest_common.sh@898 -- # local bdev_name=Malloc_STAT 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local i 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:34.575 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:34.575 [ 00:27:34.575 { 00:27:34.575 "name": "Malloc_STAT", 00:27:34.575 "aliases": [ 00:27:34.575 "c4cc5b41-df16-4936-b764-e5fdd9ec8d7f" 00:27:34.575 ], 00:27:34.575 "product_name": "Malloc disk", 00:27:34.575 "block_size": 512, 00:27:34.575 "num_blocks": 262144, 00:27:34.575 "uuid": "c4cc5b41-df16-4936-b764-e5fdd9ec8d7f", 00:27:34.575 "assigned_rate_limits": { 00:27:34.575 "rw_ios_per_sec": 0, 00:27:34.575 "rw_mbytes_per_sec": 0, 00:27:34.575 "r_mbytes_per_sec": 0, 00:27:34.575 "w_mbytes_per_sec": 0 00:27:34.575 }, 00:27:34.576 "claimed": false, 00:27:34.576 "zoned": false, 00:27:34.576 "supported_io_types": { 00:27:34.576 "read": true, 00:27:34.576 "write": true, 00:27:34.576 "unmap": true, 00:27:34.576 "write_zeroes": true, 00:27:34.576 "flush": true, 00:27:34.576 
"reset": true, 00:27:34.576 "compare": false, 00:27:34.576 "compare_and_write": false, 00:27:34.576 "abort": true, 00:27:34.576 "nvme_admin": false, 00:27:34.576 "nvme_io": false 00:27:34.576 }, 00:27:34.576 "memory_domains": [ 00:27:34.576 { 00:27:34.576 "dma_device_id": "system", 00:27:34.576 "dma_device_type": 1 00:27:34.576 }, 00:27:34.576 { 00:27:34.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:34.576 "dma_device_type": 2 00:27:34.576 } 00:27:34.576 ], 00:27:34.576 "driver_specific": {} 00:27:34.576 } 00:27:34.576 ] 00:27:34.576 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:34.576 11:38:18 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # return 0 00:27:34.576 11:38:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:27:34.576 11:38:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:34.576 Running I/O for 10 seconds... 
00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:36.490 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:27:36.490 "tick_rate": 2300000000, 00:27:36.490 "ticks": 4184341937990476, 00:27:36.490 "bdevs": [ 00:27:36.490 { 00:27:36.490 "name": "Malloc_STAT", 00:27:36.490 "bytes_read": 1006678528, 00:27:36.490 "num_read_ops": 245764, 00:27:36.490 "bytes_written": 0, 00:27:36.490 "num_write_ops": 0, 00:27:36.490 "bytes_unmapped": 0, 00:27:36.490 "num_unmap_ops": 0, 00:27:36.490 "bytes_copied": 0, 00:27:36.490 "num_copy_ops": 0, 00:27:36.490 "read_latency_ticks": 2265907346626, 00:27:36.490 "max_read_latency_ticks": 11312960, 00:27:36.490 "min_read_latency_ticks": 198236, 
00:27:36.490 "write_latency_ticks": 0, 00:27:36.490 "max_write_latency_ticks": 0, 00:27:36.490 "min_write_latency_ticks": 0, 00:27:36.490 "unmap_latency_ticks": 0, 00:27:36.490 "max_unmap_latency_ticks": 0, 00:27:36.490 "min_unmap_latency_ticks": 0, 00:27:36.490 "copy_latency_ticks": 0, 00:27:36.490 "max_copy_latency_ticks": 0, 00:27:36.491 "min_copy_latency_ticks": 0, 00:27:36.491 "io_error": {} 00:27:36.491 } 00:27:36.491 ] 00:27:36.491 }' 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=245764 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:27:36.491 "tick_rate": 2300000000, 00:27:36.491 "ticks": 4184342091335002, 00:27:36.491 "name": "Malloc_STAT", 00:27:36.491 "channels": [ 00:27:36.491 { 00:27:36.491 "thread_id": 2, 00:27:36.491 "bytes_read": 507510784, 00:27:36.491 "num_read_ops": 123904, 00:27:36.491 "bytes_written": 0, 00:27:36.491 "num_write_ops": 0, 00:27:36.491 "bytes_unmapped": 0, 00:27:36.491 "num_unmap_ops": 0, 00:27:36.491 "bytes_copied": 0, 00:27:36.491 "num_copy_ops": 0, 00:27:36.491 "read_latency_ticks": 1172054098180, 00:27:36.491 "max_read_latency_ticks": 11312960, 00:27:36.491 "min_read_latency_ticks": 5776118, 00:27:36.491 "write_latency_ticks": 0, 00:27:36.491 "max_write_latency_ticks": 0, 00:27:36.491 "min_write_latency_ticks": 0, 00:27:36.491 "unmap_latency_ticks": 0, 00:27:36.491 "max_unmap_latency_ticks": 0, 00:27:36.491 
"min_unmap_latency_ticks": 0, 00:27:36.491 "copy_latency_ticks": 0, 00:27:36.491 "max_copy_latency_ticks": 0, 00:27:36.491 "min_copy_latency_ticks": 0 00:27:36.491 }, 00:27:36.491 { 00:27:36.491 "thread_id": 3, 00:27:36.491 "bytes_read": 534773760, 00:27:36.491 "num_read_ops": 130560, 00:27:36.491 "bytes_written": 0, 00:27:36.491 "num_write_ops": 0, 00:27:36.491 "bytes_unmapped": 0, 00:27:36.491 "num_unmap_ops": 0, 00:27:36.491 "bytes_copied": 0, 00:27:36.491 "num_copy_ops": 0, 00:27:36.491 "read_latency_ticks": 1174363773330, 00:27:36.491 "max_read_latency_ticks": 11016980, 00:27:36.491 "min_read_latency_ticks": 5769180, 00:27:36.491 "write_latency_ticks": 0, 00:27:36.491 "max_write_latency_ticks": 0, 00:27:36.491 "min_write_latency_ticks": 0, 00:27:36.491 "unmap_latency_ticks": 0, 00:27:36.491 "max_unmap_latency_ticks": 0, 00:27:36.491 "min_unmap_latency_ticks": 0, 00:27:36.491 "copy_latency_ticks": 0, 00:27:36.491 "max_copy_latency_ticks": 0, 00:27:36.491 "min_copy_latency_ticks": 0 00:27:36.491 } 00:27:36.491 ] 00:27:36.491 }' 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=123904 00:27:36.491 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=123904 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=130560 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=254464 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:27:36.751 "tick_rate": 2300000000, 00:27:36.751 "ticks": 4184342357056310, 00:27:36.751 "bdevs": [ 00:27:36.751 { 00:27:36.751 "name": "Malloc_STAT", 00:27:36.751 "bytes_read": 1102098944, 00:27:36.751 "num_read_ops": 269060, 00:27:36.751 "bytes_written": 0, 00:27:36.751 "num_write_ops": 0, 00:27:36.751 "bytes_unmapped": 0, 00:27:36.751 "num_unmap_ops": 0, 00:27:36.751 "bytes_copied": 0, 00:27:36.751 "num_copy_ops": 0, 00:27:36.751 "read_latency_ticks": 2481510768126, 00:27:36.751 "max_read_latency_ticks": 11312960, 00:27:36.751 "min_read_latency_ticks": 198236, 00:27:36.751 "write_latency_ticks": 0, 00:27:36.751 "max_write_latency_ticks": 0, 00:27:36.751 "min_write_latency_ticks": 0, 00:27:36.751 "unmap_latency_ticks": 0, 00:27:36.751 "max_unmap_latency_ticks": 0, 00:27:36.751 "min_unmap_latency_ticks": 0, 00:27:36.751 "copy_latency_ticks": 0, 00:27:36.751 "max_copy_latency_ticks": 0, 00:27:36.751 "min_copy_latency_ticks": 0, 00:27:36.751 "io_error": {} 00:27:36.751 } 00:27:36.751 ] 00:27:36.751 }' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=269060 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 254464 -lt 245764 ']' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 254464 -gt 269060 ']' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:36.751 00:27:36.751 
Latency(us) 00:27:36.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:36.751 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:27:36.751 Malloc_STAT : 2.18 62138.38 242.73 0.00 0.00 4111.13 1018.66 4929.45 00:27:36.751 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:27:36.751 Malloc_STAT : 2.18 65379.08 255.39 0.00 0.00 3907.81 655.36 4815.47 00:27:36.751 =================================================================================================================== 00:27:36.751 Total : 127517.45 498.12 0.00 0.00 4006.84 655.36 4929.45 00:27:36.751 0 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 127948 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@949 -- # '[' -z 127948 ']' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # kill -0 127948 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # uname 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 127948 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 127948' 00:27:36.751 killing process with pid 127948 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # kill 127948 00:27:36.751 Received shutdown signal, test time was about 2.253395 seconds 00:27:36.751 00:27:36.751 Latency(us) 
00:27:36.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:36.751 =================================================================================================================== 00:27:36.751 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:36.751 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@973 -- # wait 127948 00:27:37.011 11:38:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:27:37.011 00:27:37.011 real 0m3.468s 00:27:37.011 user 0m6.933s 00:27:37.011 sys 0m0.400s 00:27:37.011 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:37.011 11:38:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:27:37.011 ************************************ 00:27:37.011 END TEST bdev_stat 00:27:37.011 ************************************ 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:27:37.011 11:38:20 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:27:37.011 00:27:37.011 real 1m46.798s 00:27:37.011 user 6m57.577s 00:27:37.011 sys 0m18.732s 00:27:37.011 11:38:20 
blockdev_general -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:37.011 11:38:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:27:37.011 ************************************ 00:27:37.011 END TEST blockdev_general 00:27:37.011 ************************************ 00:27:37.011 11:38:20 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:27:37.011 11:38:20 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:27:37.011 11:38:20 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:37.011 11:38:20 -- common/autotest_common.sh@10 -- # set +x 00:27:37.011 ************************************ 00:27:37.011 START TEST bdev_raid 00:27:37.011 ************************************ 00:27:37.011 11:38:20 bdev_raid -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:27:37.271 * Looking for test storage... 00:27:37.271 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:37.271 11:38:21 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 
00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:27:37.271 11:38:21 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:27:37.271 11:38:21 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:37.271 11:38:21 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:37.271 11:38:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:37.271 ************************************ 00:27:37.271 START TEST raid_function_test_raid0 00:27:37.271 ************************************ 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # raid_function_test raid0 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=128454 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 128454' 00:27:37.271 Process raid pid: 128454 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 128454 /var/tmp/spdk-raid.sock 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@830 -- # '[' -z 128454 ']' 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local max_retries=100 
00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:37.271 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:37.271 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:27:37.271 [2024-06-10 11:38:21.177557] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:27:37.271 [2024-06-10 11:38:21.177607] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:37.530 [2024-06-10 11:38:21.265937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:37.530 [2024-06-10 11:38:21.356241] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:37.530 [2024-06-10 11:38:21.411591] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:37.530 [2024-06-10 11:38:21.411612] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@863 -- # return 0 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 
-- # cat 00:27:38.099 11:38:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:27:38.361 [2024-06-10 11:38:22.173198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:27:38.361 [2024-06-10 11:38:22.174143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:27:38.361 [2024-06-10 11:38:22.174184] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa52050 00:27:38.361 [2024-06-10 11:38:22.174191] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:27:38.361 [2024-06-10 11:38:22.174309] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8b52f0 00:27:38.361 [2024-06-10 11:38:22.174382] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa52050 00:27:38.361 [2024-06-10 11:38:22.174388] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xa52050 00:27:38.361 [2024-06-10 11:38:22.174454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:38.361 Base_1 00:27:38.361 Base_2 00:27:38.361 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:27:38.361 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:38.361 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:38.623 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:27:38.624 [2024-06-10 11:38:22.534149] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa41d60 00:27:38.624 /dev/nbd0 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local i 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:38.624 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # grep -q -w nbd0 
/proc/partitions 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # break 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:38.882 1+0 records in 00:27:38.882 1+0 records out 00:27:38.882 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023629 s, 17.3 MB/s 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # size=4096 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # return 0 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:27:38.882 11:38:22 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:38.882 { 00:27:38.882 "nbd_device": "/dev/nbd0", 00:27:38.882 "bdev_name": "raid" 00:27:38.882 } 00:27:38.882 ]' 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:38.882 { 00:27:38.882 "nbd_device": "/dev/nbd0", 00:27:38.882 "bdev_name": "raid" 00:27:38.882 } 00:27:38.882 ]' 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:38.882 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:27:39.141 4096+0 records in 00:27:39.141 4096+0 records out 00:27:39.141 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0274042 s, 76.5 MB/s 00:27:39.141 11:38:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:27:39.141 4096+0 records in 00:27:39.141 4096+0 records out 00:27:39.141 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.176545 s, 11.9 MB/s 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:39.141 11:38:23 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:27:39.141 128+0 records in 00:27:39.141 128+0 records out 00:27:39.141 65536 bytes (66 kB, 64 KiB) copied, 0.000826781 s, 79.3 MB/s 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:27:39.141 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:27:39.401 2035+0 records in 00:27:39.401 2035+0 records out 00:27:39.401 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0116443 s, 89.5 MB/s 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs 
/dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:27:39.401 456+0 records in 00:27:39.401 456+0 records out 00:27:39.401 233472 bytes (233 kB, 228 KiB) copied, 0.00269109 s, 86.8 MB/s 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:39.401 [2024-06-10 11:38:23.336520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:27:39.401 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 
00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 128454 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@949 -- # '[' -z 128454 ']' 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # kill -0 128454 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # uname 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:39.660 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 128454 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 128454' 00:27:39.919 killing process with pid 128454 00:27:39.919 
11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # kill 128454 00:27:39.919 [2024-06-10 11:38:23.623745] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:39.919 [2024-06-10 11:38:23.623791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:39.919 [2024-06-10 11:38:23.623819] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:39.919 [2024-06-10 11:38:23.623827] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa52050 name raid, state offline 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@973 -- # wait 128454 00:27:39.919 [2024-06-10 11:38:23.639287] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:27:39.919 00:27:39.919 real 0m2.706s 00:27:39.919 user 0m3.452s 00:27:39.919 sys 0m1.025s 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:39.919 11:38:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:27:39.919 ************************************ 00:27:39.919 END TEST raid_function_test_raid0 00:27:39.919 ************************************ 00:27:40.179 11:38:23 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:27:40.179 11:38:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:40.179 11:38:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:40.179 11:38:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:40.179 ************************************ 00:27:40.179 START TEST raid_function_test_concat 00:27:40.179 ************************************ 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # 
raid_function_test concat 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=128926 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 128926' 00:27:40.179 Process raid pid: 128926 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 128926 /var/tmp/spdk-raid.sock 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@830 -- # '[' -z 128926 ']' 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:40.179 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:40.179 11:38:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:27:40.179 [2024-06-10 11:38:23.957164] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:27:40.179 [2024-06-10 11:38:23.957214] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:40.179 [2024-06-10 11:38:24.045685] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:40.438 [2024-06-10 11:38:24.138748] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:40.438 [2024-06-10 11:38:24.196155] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:40.438 [2024-06-10 11:38:24.196177] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@863 -- # return 0 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:27:41.007 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:27:41.007 [2024-06-10 11:38:24.949948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:27:41.007 [2024-06-10 11:38:24.950961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:27:41.007 [2024-06-10 11:38:24.951004] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2313050 00:27:41.007 [2024-06-10 11:38:24.951012] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:27:41.007 [2024-06-10 11:38:24.951140] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21762f0 00:27:41.007 [2024-06-10 11:38:24.951222] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2313050 00:27:41.007 [2024-06-10 11:38:24.951229] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x2313050 00:27:41.007 [2024-06-10 11:38:24.951297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:41.267 Base_1 00:27:41.267 Base_2 00:27:41.267 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:27:41.267 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:41.267 11:38:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:41.267 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:27:41.526 [2024-06-10 11:38:25.310898] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2302d60 00:27:41.526 /dev/nbd0 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local i 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # break 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:41.526 1+0 records in 
00:27:41.526 1+0 records out 00:27:41.526 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026733 s, 15.3 MB/s 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # size=4096 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # return 0 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:41.526 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:41.786 { 00:27:41.786 "nbd_device": "/dev/nbd0", 00:27:41.786 "bdev_name": "raid" 00:27:41.786 } 00:27:41.786 ]' 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:41.786 { 00:27:41.786 "nbd_device": "/dev/nbd0", 00:27:41.786 "bdev_name": "raid" 00:27:41.786 } 00:27:41.786 ]' 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:41.786 11:38:25 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:27:41.786 4096+0 records in 00:27:41.786 4096+0 records out 00:27:41.786 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.026746 s, 78.4 MB/s 00:27:41.786 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:27:42.046 4096+0 records in 00:27:42.046 4096+0 records out 00:27:42.046 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.139087 s, 15.1 MB/s 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:27:42.046 
128+0 records in 00:27:42.046 128+0 records out 00:27:42.046 65536 bytes (66 kB, 64 KiB) copied, 0.00066614 s, 98.4 MB/s 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:27:42.046 2035+0 records in 00:27:42.046 2035+0 records out 00:27:42.046 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00390679 s, 267 MB/s 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:27:42.046 456+0 records in 00:27:42.046 456+0 records out 00:27:42.046 233472 bytes (233 kB, 228 KiB) copied, 0.00271354 s, 86.0 MB/s 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:42.046 11:38:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:42.305 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
00:27:42.306 [2024-06-10 11:38:26.044435] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:42.306 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 128926 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@949 -- # '[' -z 128926 ']' 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # kill -0 128926 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # uname 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 128926 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 128926' 00:27:42.565 killing process with pid 128926 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # kill 128926 00:27:42.565 [2024-06-10 11:38:26.326884] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:42.565 [2024-06-10 11:38:26.326930] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:42.565 [2024-06-10 11:38:26.326958] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:42.565 
[2024-06-10 11:38:26.326966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2313050 name raid, state offline 00:27:42.565 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@973 -- # wait 128926 00:27:42.565 [2024-06-10 11:38:26.343728] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:42.825 11:38:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:27:42.825 00:27:42.825 real 0m2.640s 00:27:42.825 user 0m3.410s 00:27:42.825 sys 0m0.968s 00:27:42.825 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:42.825 11:38:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:27:42.825 ************************************ 00:27:42.825 END TEST raid_function_test_concat 00:27:42.825 ************************************ 00:27:42.825 11:38:26 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:27:42.825 11:38:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:27:42.825 11:38:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:42.825 11:38:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:42.825 ************************************ 00:27:42.825 START TEST raid0_resize_test 00:27:42.825 ************************************ 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # raid0_resize_test 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 
00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=129388 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 129388' 00:27:42.825 Process raid pid: 129388 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 129388 /var/tmp/spdk-raid.sock 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@830 -- # '[' -z 129388 ']' 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:42.825 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:42.825 11:38:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:27:42.825 [2024-06-10 11:38:26.675665] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:27:42.825 [2024-06-10 11:38:26.675717] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:42.825 [2024-06-10 11:38:26.763315] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:43.084 [2024-06-10 11:38:26.847526] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:43.084 [2024-06-10 11:38:26.897009] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:43.084 [2024-06-10 11:38:26.897034] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:43.652 11:38:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:43.652 11:38:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@863 -- # return 0 00:27:43.653 11:38:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:27:43.912 Base_1 00:27:43.912 11:38:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:27:43.912 Base_2 00:27:43.912 11:38:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:27:44.171 [2024-06-10 11:38:27.933461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:27:44.171 [2024-06-10 11:38:27.934592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:27:44.171 [2024-06-10 11:38:27.934633] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfbe480 00:27:44.171 [2024-06-10 11:38:27.934641] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:27:44.171 [2024-06-10 11:38:27.934792] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf77ae0 00:27:44.171 [2024-06-10 11:38:27.934857] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfbe480 00:27:44.171 [2024-06-10 11:38:27.934864] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xfbe480 00:27:44.171 [2024-06-10 11:38:27.934953] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:44.171 11:38:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:27:44.171 [2024-06-10 11:38:28.097873] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:27:44.171 [2024-06-10 11:38:28.097890] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:27:44.171 true 00:27:44.171 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:27:44.171 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:27:44.430 [2024-06-10 11:38:28.274400] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:44.430 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:27:44.430 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:27:44.430 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:27:44.430 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:27:44.689 
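The pass/fail check in raid0_resize_test above is plain arithmetic: the `num_blocks` value that `bdev_get_bdevs` reports, multiplied by the 512-byte block length and divided by 1 MiB, must equal the expected size. A minimal sketch of that calculation, using the block counts visible in the log (no SPDK target needed):

```shell
# Sizes from the log above: two 32 MiB null bdevs striped into a raid0.
blklen=512

# Before the second base bdev is resized: 131072 blocks * 512 B = 64 MiB.
blkcnt=131072
raid_size_mb=$(( blkcnt * blklen / 1048576 ))
echo "raid size: ${raid_size_mb} MiB"   # 64

# After both base bdevs grow from 32 MiB to 64 MiB, raid0 doubles: 262144 blocks.
blkcnt=262144
raid_size_mb=$(( blkcnt * blklen / 1048576 ))
echo "raid size: ${raid_size_mb} MiB"   # 128
```

This matches the log's transition of the 'Raid' bdev from 131072 to 262144 blocks, and the test's `'[' 64 '!=' 64 ']'` / `'[' 128 '!=' 128 ']'` comparisons both evaluating false (i.e. passing).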
[2024-06-10 11:38:28.458774] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:27:44.689 [2024-06-10 11:38:28.458785] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:27:44.689 [2024-06-10 11:38:28.458800] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:27:44.689 true 00:27:44.689 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:27:44.689 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:27:44.947 [2024-06-10 11:38:28.639317] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 129388 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@949 -- # '[' -z 129388 ']' 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # kill -0 129388 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # uname 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 129388 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@959 -- # '[' 
reactor_0 = sudo ']' 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 129388' 00:27:44.947 killing process with pid 129388 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # kill 129388 00:27:44.947 [2024-06-10 11:38:28.701236] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@973 -- # wait 129388 00:27:44.947 [2024-06-10 11:38:28.701276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:44.947 [2024-06-10 11:38:28.701307] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:44.947 [2024-06-10 11:38:28.701315] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbe480 name Raid, state offline 00:27:44.947 [2024-06-10 11:38:28.702478] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:27:44.947 00:27:44.947 real 0m2.259s 00:27:44.947 user 0m3.358s 00:27:44.947 sys 0m0.481s 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:44.947 11:38:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:27:44.947 ************************************ 00:27:44.947 END TEST raid0_resize_test 00:27:44.947 ************************************ 00:27:45.205 11:38:28 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:27:45.205 11:38:28 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:27:45.205 11:38:28 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:27:45.205 11:38:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:45.205 11:38:28 bdev_raid -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:27:45.205 11:38:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:45.205 ************************************ 00:27:45.205 START TEST raid_state_function_test 00:27:45.205 ************************************ 00:27:45.205 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 false 00:27:45.205 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:27:45.205 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:45.205 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:27:45.205 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:45.205 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 
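Throughout raid_state_function_test, the `verify_raid_bdev_state` helper feeds `bdev_raid_get_bdevs all` output through `jq -r '.[] | select(.name == "Existed_Raid")'` and compares fields such as `"state"` against an expected value. A rough sketch of the same check against a captured sample (field values mirror the JSON dumps in this log; `sed` stands in for `jq` so the snippet has no external dependencies):

```shell
# Sample slice of `bdev_raid_get_bdevs all` output, as dumped in the log.
raid_bdev_info='{ "name": "Existed_Raid", "state": "configuring", "raid_level": "raid0", "num_base_bdevs": 2, "num_base_bdevs_discovered": 0 }'
expected_state="configuring"

# The real helper extracts fields with jq; pull "state" out with sed instead.
state=$(printf '%s' "$raid_bdev_info" | sed -n 's/.*"state": "\([a-z]*\)".*/\1/p')

if [ "$state" = "$expected_state" ]; then
  echo "state check passed: $state"
else
  echo "state check failed: got '$state', want '$expected_state'" >&2
  exit 1
fi
```

In the log, the raid bdev sits in `"state": "configuring"` while base bdevs are still missing, then flips to `"online"` once both BaseBdev1 and BaseBdev2 are claimed.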
00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=129666 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 129666' 00:27:45.206 Process raid pid: 129666 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 129666 /var/tmp/spdk-raid.sock 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 129666 ']' 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk-raid.sock...' 00:27:45.206 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:45.206 11:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:45.206 [2024-06-10 11:38:29.022743] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:27:45.206 [2024-06-10 11:38:29.022794] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:45.206 [2024-06-10 11:38:29.109776] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:45.464 [2024-06-10 11:38:29.195507] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:45.464 [2024-06-10 11:38:29.248350] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:45.464 [2024-06-10 11:38:29.248377] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:46.031 11:38:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:46.031 11:38:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:27:46.031 11:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:46.290 [2024-06-10 11:38:29.978779] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:46.290 [2024-06-10 11:38:29.978816] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:46.290 [2024-06-10 11:38:29.978824] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:27:46.290 [2024-06-10 11:38:29.978832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:46.290 11:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:27:46.290 11:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:46.290 11:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:46.290 "name": "Existed_Raid", 00:27:46.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.290 "strip_size_kb": 64, 00:27:46.290 "state": "configuring", 00:27:46.290 "raid_level": "raid0", 00:27:46.290 "superblock": false, 
00:27:46.290 "num_base_bdevs": 2, 00:27:46.290 "num_base_bdevs_discovered": 0, 00:27:46.290 "num_base_bdevs_operational": 2, 00:27:46.290 "base_bdevs_list": [ 00:27:46.290 { 00:27:46.290 "name": "BaseBdev1", 00:27:46.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.290 "is_configured": false, 00:27:46.290 "data_offset": 0, 00:27:46.290 "data_size": 0 00:27:46.290 }, 00:27:46.290 { 00:27:46.290 "name": "BaseBdev2", 00:27:46.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.290 "is_configured": false, 00:27:46.290 "data_offset": 0, 00:27:46.290 "data_size": 0 00:27:46.290 } 00:27:46.290 ] 00:27:46.290 }' 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:46.290 11:38:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:46.911 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:46.911 [2024-06-10 11:38:30.808918] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:46.911 [2024-06-10 11:38:30.808946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x917510 name Existed_Raid, state configuring 00:27:46.911 11:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:47.169 [2024-06-10 11:38:30.981363] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:47.169 [2024-06-10 11:38:30.981384] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:47.169 [2024-06-10 11:38:30.981390] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:47.169 [2024-06-10 11:38:30.981397] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:47.169 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:27:47.427 [2024-06-10 11:38:31.150338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:47.427 BaseBdev1 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:47.427 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:47.685 [ 00:27:47.685 { 00:27:47.685 "name": "BaseBdev1", 00:27:47.685 "aliases": [ 00:27:47.685 "106d3169-6d2d-41dc-bb4a-219d37a33d63" 00:27:47.685 ], 00:27:47.685 "product_name": "Malloc disk", 00:27:47.685 "block_size": 512, 00:27:47.685 "num_blocks": 65536, 00:27:47.685 "uuid": "106d3169-6d2d-41dc-bb4a-219d37a33d63", 00:27:47.685 "assigned_rate_limits": { 00:27:47.685 "rw_ios_per_sec": 0, 00:27:47.685 "rw_mbytes_per_sec": 0, 
00:27:47.685 "r_mbytes_per_sec": 0, 00:27:47.685 "w_mbytes_per_sec": 0 00:27:47.685 }, 00:27:47.685 "claimed": true, 00:27:47.685 "claim_type": "exclusive_write", 00:27:47.685 "zoned": false, 00:27:47.685 "supported_io_types": { 00:27:47.685 "read": true, 00:27:47.685 "write": true, 00:27:47.685 "unmap": true, 00:27:47.685 "write_zeroes": true, 00:27:47.685 "flush": true, 00:27:47.685 "reset": true, 00:27:47.685 "compare": false, 00:27:47.685 "compare_and_write": false, 00:27:47.685 "abort": true, 00:27:47.685 "nvme_admin": false, 00:27:47.685 "nvme_io": false 00:27:47.685 }, 00:27:47.685 "memory_domains": [ 00:27:47.685 { 00:27:47.685 "dma_device_id": "system", 00:27:47.685 "dma_device_type": 1 00:27:47.685 }, 00:27:47.685 { 00:27:47.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.685 "dma_device_type": 2 00:27:47.685 } 00:27:47.685 ], 00:27:47.685 "driver_specific": {} 00:27:47.685 } 00:27:47.685 ] 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.685 11:38:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.685 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:47.944 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:47.944 "name": "Existed_Raid", 00:27:47.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.944 "strip_size_kb": 64, 00:27:47.944 "state": "configuring", 00:27:47.944 "raid_level": "raid0", 00:27:47.944 "superblock": false, 00:27:47.944 "num_base_bdevs": 2, 00:27:47.944 "num_base_bdevs_discovered": 1, 00:27:47.944 "num_base_bdevs_operational": 2, 00:27:47.944 "base_bdevs_list": [ 00:27:47.944 { 00:27:47.944 "name": "BaseBdev1", 00:27:47.944 "uuid": "106d3169-6d2d-41dc-bb4a-219d37a33d63", 00:27:47.944 "is_configured": true, 00:27:47.944 "data_offset": 0, 00:27:47.944 "data_size": 65536 00:27:47.944 }, 00:27:47.944 { 00:27:47.944 "name": "BaseBdev2", 00:27:47.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.944 "is_configured": false, 00:27:47.944 "data_offset": 0, 00:27:47.944 "data_size": 0 00:27:47.944 } 00:27:47.944 ] 00:27:47.944 }' 00:27:47.944 11:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:47.944 11:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:48.510 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:48.510 [2024-06-10 11:38:32.321367] bdev_raid.c:2356:raid_bdev_delete: 
*DEBUG*: delete raid bdev: Existed_Raid 00:27:48.510 [2024-06-10 11:38:32.321401] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x916e00 name Existed_Raid, state configuring 00:27:48.510 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:48.769 [2024-06-10 11:38:32.493830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:48.769 [2024-06-10 11:38:32.494853] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:48.769 [2024-06-10 11:38:32.494886] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.769 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.769 
11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:48.770 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.770 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.770 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:48.770 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:48.770 "name": "Existed_Raid", 00:27:48.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.770 "strip_size_kb": 64, 00:27:48.770 "state": "configuring", 00:27:48.770 "raid_level": "raid0", 00:27:48.770 "superblock": false, 00:27:48.770 "num_base_bdevs": 2, 00:27:48.770 "num_base_bdevs_discovered": 1, 00:27:48.770 "num_base_bdevs_operational": 2, 00:27:48.770 "base_bdevs_list": [ 00:27:48.770 { 00:27:48.770 "name": "BaseBdev1", 00:27:48.770 "uuid": "106d3169-6d2d-41dc-bb4a-219d37a33d63", 00:27:48.770 "is_configured": true, 00:27:48.770 "data_offset": 0, 00:27:48.770 "data_size": 65536 00:27:48.770 }, 00:27:48.770 { 00:27:48.770 "name": "BaseBdev2", 00:27:48.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.770 "is_configured": false, 00:27:48.770 "data_offset": 0, 00:27:48.770 "data_size": 0 00:27:48.770 } 00:27:48.770 ] 00:27:48.770 }' 00:27:48.770 11:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:48.770 11:38:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:49.336 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:27:49.595 [2024-06-10 11:38:33.330885] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:49.595 [2024-06-10 11:38:33.330916] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x917bf0 00:27:49.595 [2024-06-10 11:38:33.330921] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:27:49.595 [2024-06-10 11:38:33.331093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xac99b0 00:27:49.595 [2024-06-10 11:38:33.331176] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x917bf0 00:27:49.595 [2024-06-10 11:38:33.331182] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x917bf0 00:27:49.595 [2024-06-10 11:38:33.331319] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:49.595 BaseBdev2 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:49.595 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:49.854 [ 00:27:49.854 { 00:27:49.854 
"name": "BaseBdev2", 00:27:49.854 "aliases": [ 00:27:49.854 "ee77f8ab-7367-4230-8443-44d52a76e7fb" 00:27:49.854 ], 00:27:49.854 "product_name": "Malloc disk", 00:27:49.854 "block_size": 512, 00:27:49.854 "num_blocks": 65536, 00:27:49.854 "uuid": "ee77f8ab-7367-4230-8443-44d52a76e7fb", 00:27:49.854 "assigned_rate_limits": { 00:27:49.854 "rw_ios_per_sec": 0, 00:27:49.854 "rw_mbytes_per_sec": 0, 00:27:49.854 "r_mbytes_per_sec": 0, 00:27:49.854 "w_mbytes_per_sec": 0 00:27:49.854 }, 00:27:49.854 "claimed": true, 00:27:49.854 "claim_type": "exclusive_write", 00:27:49.854 "zoned": false, 00:27:49.854 "supported_io_types": { 00:27:49.854 "read": true, 00:27:49.854 "write": true, 00:27:49.854 "unmap": true, 00:27:49.854 "write_zeroes": true, 00:27:49.854 "flush": true, 00:27:49.854 "reset": true, 00:27:49.854 "compare": false, 00:27:49.854 "compare_and_write": false, 00:27:49.854 "abort": true, 00:27:49.854 "nvme_admin": false, 00:27:49.854 "nvme_io": false 00:27:49.854 }, 00:27:49.854 "memory_domains": [ 00:27:49.854 { 00:27:49.854 "dma_device_id": "system", 00:27:49.854 "dma_device_type": 1 00:27:49.854 }, 00:27:49.854 { 00:27:49.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:49.854 "dma_device_type": 2 00:27:49.854 } 00:27:49.854 ], 00:27:49.854 "driver_specific": {} 00:27:49.854 } 00:27:49.854 ] 00:27:49.854 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:49.854 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:49.854 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.855 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:50.112 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.113 "name": "Existed_Raid", 00:27:50.113 "uuid": "5097e5fa-86de-4337-a81e-fc08513ba2de", 00:27:50.113 "strip_size_kb": 64, 00:27:50.113 "state": "online", 00:27:50.113 "raid_level": "raid0", 00:27:50.113 "superblock": false, 00:27:50.113 "num_base_bdevs": 2, 00:27:50.113 "num_base_bdevs_discovered": 2, 00:27:50.113 "num_base_bdevs_operational": 2, 00:27:50.113 "base_bdevs_list": [ 00:27:50.113 { 00:27:50.113 "name": "BaseBdev1", 00:27:50.113 "uuid": "106d3169-6d2d-41dc-bb4a-219d37a33d63", 00:27:50.113 "is_configured": true, 00:27:50.113 "data_offset": 0, 00:27:50.113 "data_size": 65536 00:27:50.113 }, 00:27:50.113 { 00:27:50.113 "name": "BaseBdev2", 00:27:50.113 "uuid": "ee77f8ab-7367-4230-8443-44d52a76e7fb", 00:27:50.113 "is_configured": true, 
00:27:50.113 "data_offset": 0, 00:27:50.113 "data_size": 65536 00:27:50.113 } 00:27:50.113 ] 00:27:50.113 }' 00:27:50.113 11:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.113 11:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:50.679 [2024-06-10 11:38:34.506080] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:50.679 "name": "Existed_Raid", 00:27:50.679 "aliases": [ 00:27:50.679 "5097e5fa-86de-4337-a81e-fc08513ba2de" 00:27:50.679 ], 00:27:50.679 "product_name": "Raid Volume", 00:27:50.679 "block_size": 512, 00:27:50.679 "num_blocks": 131072, 00:27:50.679 "uuid": "5097e5fa-86de-4337-a81e-fc08513ba2de", 00:27:50.679 "assigned_rate_limits": { 00:27:50.679 "rw_ios_per_sec": 0, 00:27:50.679 "rw_mbytes_per_sec": 0, 00:27:50.679 "r_mbytes_per_sec": 0, 00:27:50.679 "w_mbytes_per_sec": 0 
00:27:50.679 }, 00:27:50.679 "claimed": false, 00:27:50.679 "zoned": false, 00:27:50.679 "supported_io_types": { 00:27:50.679 "read": true, 00:27:50.679 "write": true, 00:27:50.679 "unmap": true, 00:27:50.679 "write_zeroes": true, 00:27:50.679 "flush": true, 00:27:50.679 "reset": true, 00:27:50.679 "compare": false, 00:27:50.679 "compare_and_write": false, 00:27:50.679 "abort": false, 00:27:50.679 "nvme_admin": false, 00:27:50.679 "nvme_io": false 00:27:50.679 }, 00:27:50.679 "memory_domains": [ 00:27:50.679 { 00:27:50.679 "dma_device_id": "system", 00:27:50.679 "dma_device_type": 1 00:27:50.679 }, 00:27:50.679 { 00:27:50.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:50.679 "dma_device_type": 2 00:27:50.679 }, 00:27:50.679 { 00:27:50.679 "dma_device_id": "system", 00:27:50.679 "dma_device_type": 1 00:27:50.679 }, 00:27:50.679 { 00:27:50.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:50.679 "dma_device_type": 2 00:27:50.679 } 00:27:50.679 ], 00:27:50.679 "driver_specific": { 00:27:50.679 "raid": { 00:27:50.679 "uuid": "5097e5fa-86de-4337-a81e-fc08513ba2de", 00:27:50.679 "strip_size_kb": 64, 00:27:50.679 "state": "online", 00:27:50.679 "raid_level": "raid0", 00:27:50.679 "superblock": false, 00:27:50.679 "num_base_bdevs": 2, 00:27:50.679 "num_base_bdevs_discovered": 2, 00:27:50.679 "num_base_bdevs_operational": 2, 00:27:50.679 "base_bdevs_list": [ 00:27:50.679 { 00:27:50.679 "name": "BaseBdev1", 00:27:50.679 "uuid": "106d3169-6d2d-41dc-bb4a-219d37a33d63", 00:27:50.679 "is_configured": true, 00:27:50.679 "data_offset": 0, 00:27:50.679 "data_size": 65536 00:27:50.679 }, 00:27:50.679 { 00:27:50.679 "name": "BaseBdev2", 00:27:50.679 "uuid": "ee77f8ab-7367-4230-8443-44d52a76e7fb", 00:27:50.679 "is_configured": true, 00:27:50.679 "data_offset": 0, 00:27:50.679 "data_size": 65536 00:27:50.679 } 00:27:50.679 ] 00:27:50.679 } 00:27:50.679 } 00:27:50.679 }' 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:50.679 BaseBdev2' 00:27:50.679 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:50.680 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:50.680 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:50.938 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:50.938 "name": "BaseBdev1", 00:27:50.938 "aliases": [ 00:27:50.938 "106d3169-6d2d-41dc-bb4a-219d37a33d63" 00:27:50.938 ], 00:27:50.938 "product_name": "Malloc disk", 00:27:50.938 "block_size": 512, 00:27:50.938 "num_blocks": 65536, 00:27:50.938 "uuid": "106d3169-6d2d-41dc-bb4a-219d37a33d63", 00:27:50.938 "assigned_rate_limits": { 00:27:50.938 "rw_ios_per_sec": 0, 00:27:50.938 "rw_mbytes_per_sec": 0, 00:27:50.938 "r_mbytes_per_sec": 0, 00:27:50.938 "w_mbytes_per_sec": 0 00:27:50.938 }, 00:27:50.938 "claimed": true, 00:27:50.938 "claim_type": "exclusive_write", 00:27:50.938 "zoned": false, 00:27:50.938 "supported_io_types": { 00:27:50.938 "read": true, 00:27:50.938 "write": true, 00:27:50.938 "unmap": true, 00:27:50.938 "write_zeroes": true, 00:27:50.938 "flush": true, 00:27:50.938 "reset": true, 00:27:50.938 "compare": false, 00:27:50.938 "compare_and_write": false, 00:27:50.938 "abort": true, 00:27:50.938 "nvme_admin": false, 00:27:50.938 "nvme_io": false 00:27:50.938 }, 00:27:50.938 "memory_domains": [ 00:27:50.938 { 00:27:50.938 "dma_device_id": "system", 00:27:50.938 "dma_device_type": 1 00:27:50.938 }, 00:27:50.938 { 00:27:50.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:50.938 "dma_device_type": 2 00:27:50.938 } 
00:27:50.938 ], 00:27:50.938 "driver_specific": {} 00:27:50.938 }' 00:27:50.938 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:50.938 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:50.938 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:50.938 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:50.938 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:51.197 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:51.197 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:51.197 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:51.197 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:51.197 11:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:51.197 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:51.197 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:51.197 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:51.197 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:51.197 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:51.455 "name": "BaseBdev2", 00:27:51.455 "aliases": [ 00:27:51.455 "ee77f8ab-7367-4230-8443-44d52a76e7fb" 00:27:51.455 ], 00:27:51.455 "product_name": "Malloc disk", 
00:27:51.455 "block_size": 512, 00:27:51.455 "num_blocks": 65536, 00:27:51.455 "uuid": "ee77f8ab-7367-4230-8443-44d52a76e7fb", 00:27:51.455 "assigned_rate_limits": { 00:27:51.455 "rw_ios_per_sec": 0, 00:27:51.455 "rw_mbytes_per_sec": 0, 00:27:51.455 "r_mbytes_per_sec": 0, 00:27:51.455 "w_mbytes_per_sec": 0 00:27:51.455 }, 00:27:51.455 "claimed": true, 00:27:51.455 "claim_type": "exclusive_write", 00:27:51.455 "zoned": false, 00:27:51.455 "supported_io_types": { 00:27:51.455 "read": true, 00:27:51.455 "write": true, 00:27:51.455 "unmap": true, 00:27:51.455 "write_zeroes": true, 00:27:51.455 "flush": true, 00:27:51.455 "reset": true, 00:27:51.455 "compare": false, 00:27:51.455 "compare_and_write": false, 00:27:51.455 "abort": true, 00:27:51.455 "nvme_admin": false, 00:27:51.455 "nvme_io": false 00:27:51.455 }, 00:27:51.455 "memory_domains": [ 00:27:51.455 { 00:27:51.455 "dma_device_id": "system", 00:27:51.455 "dma_device_type": 1 00:27:51.455 }, 00:27:51.455 { 00:27:51.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:51.455 "dma_device_type": 2 00:27:51.455 } 00:27:51.455 ], 00:27:51.455 "driver_specific": {} 00:27:51.455 }' 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:51.455 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:51.713 11:38:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:51.713 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:51.713 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:51.713 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:51.713 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:51.972 [2024-06-10 11:38:35.673012] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:51.972 [2024-06-10 11:38:35.673039] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:51.972 [2024-06-10 11:38:35.673069] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:51.972 11:38:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:51.972 "name": "Existed_Raid", 00:27:51.972 "uuid": "5097e5fa-86de-4337-a81e-fc08513ba2de", 00:27:51.972 "strip_size_kb": 64, 00:27:51.972 "state": "offline", 00:27:51.972 "raid_level": "raid0", 00:27:51.972 "superblock": false, 00:27:51.972 "num_base_bdevs": 2, 00:27:51.972 "num_base_bdevs_discovered": 1, 00:27:51.972 "num_base_bdevs_operational": 1, 00:27:51.972 "base_bdevs_list": [ 00:27:51.972 { 00:27:51.972 "name": null, 00:27:51.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:51.972 "is_configured": false, 00:27:51.972 "data_offset": 0, 00:27:51.972 "data_size": 65536 00:27:51.972 }, 00:27:51.972 { 00:27:51.972 "name": "BaseBdev2", 00:27:51.972 "uuid": "ee77f8ab-7367-4230-8443-44d52a76e7fb", 00:27:51.972 "is_configured": true, 00:27:51.972 "data_offset": 0, 00:27:51.972 "data_size": 65536 00:27:51.972 } 00:27:51.972 ] 00:27:51.972 }' 00:27:51.972 11:38:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:51.972 11:38:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:52.540 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:52.540 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:52.540 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.540 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:52.800 [2024-06-10 11:38:36.697352] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:52.800 [2024-06-10 11:38:36.697397] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x917bf0 name Existed_Raid, state offline 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.800 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 129666 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 129666 ']' 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 129666 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 129666 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 129666' 00:27:53.059 killing process with pid 129666 00:27:53.059 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 129666 00:27:53.059 [2024-06-10 11:38:36.954802] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:53.060 11:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 129666 00:27:53.060 [2024-06-10 11:38:36.955703] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:27:53.319 00:27:53.319 real 0m8.195s 00:27:53.319 user 0m14.400s 00:27:53.319 sys 0m1.639s 00:27:53.319 11:38:37 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:53.319 ************************************ 00:27:53.319 END TEST raid_state_function_test 00:27:53.319 ************************************ 00:27:53.319 11:38:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:27:53.319 11:38:37 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:53.319 11:38:37 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:53.319 11:38:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:53.319 ************************************ 00:27:53.319 START TEST raid_state_function_test_sb 00:27:53.319 ************************************ 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 true 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=131049 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 131049' 00:27:53.319 Process raid pid: 131049 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 131049 /var/tmp/spdk-raid.sock 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 131049 ']' 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:53.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:53.319 11:38:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:53.578 [2024-06-10 11:38:37.301284] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:27:53.578 [2024-06-10 11:38:37.301333] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:53.578 [2024-06-10 11:38:37.388357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:53.579 [2024-06-10 11:38:37.472209] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:53.838 [2024-06-10 11:38:37.527257] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:53.838 [2024-06-10 11:38:37.527278] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:54.407 [2024-06-10 11:38:38.255563] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:54.407 [2024-06-10 11:38:38.255600] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:54.407 [2024-06-10 11:38:38.255608] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:54.407 [2024-06-10 11:38:38.255616] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:54.407 11:38:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.407 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:54.666 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.666 "name": "Existed_Raid", 00:27:54.666 "uuid": "e1572754-8bfb-4b9c-9bcc-63d562bf1215", 00:27:54.666 "strip_size_kb": 64, 00:27:54.666 "state": "configuring", 00:27:54.666 "raid_level": "raid0", 00:27:54.666 "superblock": true, 00:27:54.666 "num_base_bdevs": 2, 00:27:54.666 "num_base_bdevs_discovered": 0, 00:27:54.666 "num_base_bdevs_operational": 2, 00:27:54.666 "base_bdevs_list": [ 00:27:54.666 { 00:27:54.666 "name": "BaseBdev1", 00:27:54.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.666 "is_configured": false, 00:27:54.666 "data_offset": 0, 00:27:54.666 "data_size": 0 00:27:54.666 }, 00:27:54.666 { 00:27:54.666 "name": 
"BaseBdev2", 00:27:54.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.666 "is_configured": false, 00:27:54.666 "data_offset": 0, 00:27:54.666 "data_size": 0 00:27:54.666 } 00:27:54.666 ] 00:27:54.666 }' 00:27:54.666 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.666 11:38:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:55.233 11:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:55.233 [2024-06-10 11:38:39.101671] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:55.233 [2024-06-10 11:38:39.101692] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa15510 name Existed_Raid, state configuring 00:27:55.233 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:55.491 [2024-06-10 11:38:39.266115] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:55.491 [2024-06-10 11:38:39.266136] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:55.491 [2024-06-10 11:38:39.266142] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:55.491 [2024-06-10 11:38:39.266149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:55.491 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:27:55.749 [2024-06-10 11:38:39.447152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:27:55.749 BaseBdev1 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:55.749 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:56.008 [ 00:27:56.008 { 00:27:56.008 "name": "BaseBdev1", 00:27:56.008 "aliases": [ 00:27:56.008 "55791164-4e45-4b3a-a8f3-5494c6aa8f78" 00:27:56.008 ], 00:27:56.008 "product_name": "Malloc disk", 00:27:56.008 "block_size": 512, 00:27:56.008 "num_blocks": 65536, 00:27:56.008 "uuid": "55791164-4e45-4b3a-a8f3-5494c6aa8f78", 00:27:56.008 "assigned_rate_limits": { 00:27:56.008 "rw_ios_per_sec": 0, 00:27:56.008 "rw_mbytes_per_sec": 0, 00:27:56.008 "r_mbytes_per_sec": 0, 00:27:56.008 "w_mbytes_per_sec": 0 00:27:56.008 }, 00:27:56.008 "claimed": true, 00:27:56.008 "claim_type": "exclusive_write", 00:27:56.008 "zoned": false, 00:27:56.008 "supported_io_types": { 00:27:56.008 "read": true, 00:27:56.008 "write": true, 00:27:56.008 "unmap": true, 00:27:56.008 "write_zeroes": true, 00:27:56.008 "flush": true, 
00:27:56.008 "reset": true, 00:27:56.008 "compare": false, 00:27:56.008 "compare_and_write": false, 00:27:56.008 "abort": true, 00:27:56.008 "nvme_admin": false, 00:27:56.008 "nvme_io": false 00:27:56.008 }, 00:27:56.008 "memory_domains": [ 00:27:56.008 { 00:27:56.008 "dma_device_id": "system", 00:27:56.008 "dma_device_type": 1 00:27:56.008 }, 00:27:56.008 { 00:27:56.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.008 "dma_device_type": 2 00:27:56.008 } 00:27:56.008 ], 00:27:56.008 "driver_specific": {} 00:27:56.008 } 00:27:56.008 ] 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.008 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:56.267 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:56.267 "name": "Existed_Raid", 00:27:56.267 "uuid": "27138c25-7365-4ddd-8fd0-4ff232b43e4f", 00:27:56.267 "strip_size_kb": 64, 00:27:56.267 "state": "configuring", 00:27:56.267 "raid_level": "raid0", 00:27:56.267 "superblock": true, 00:27:56.267 "num_base_bdevs": 2, 00:27:56.267 "num_base_bdevs_discovered": 1, 00:27:56.267 "num_base_bdevs_operational": 2, 00:27:56.267 "base_bdevs_list": [ 00:27:56.267 { 00:27:56.267 "name": "BaseBdev1", 00:27:56.267 "uuid": "55791164-4e45-4b3a-a8f3-5494c6aa8f78", 00:27:56.267 "is_configured": true, 00:27:56.267 "data_offset": 2048, 00:27:56.267 "data_size": 63488 00:27:56.267 }, 00:27:56.267 { 00:27:56.267 "name": "BaseBdev2", 00:27:56.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:56.267 "is_configured": false, 00:27:56.267 "data_offset": 0, 00:27:56.267 "data_size": 0 00:27:56.267 } 00:27:56.267 ] 00:27:56.267 }' 00:27:56.267 11:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:56.267 11:38:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:56.835 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:56.835 [2024-06-10 11:38:40.654256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:56.835 [2024-06-10 11:38:40.654289] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa14e00 name Existed_Raid, state configuring 00:27:56.835 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:57.094 [2024-06-10 11:38:40.830740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:57.094 [2024-06-10 11:38:40.831751] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:57.094 [2024-06-10 11:38:40.831776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:57.094 11:38:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.094 11:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:57.094 11:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.094 "name": "Existed_Raid", 00:27:57.094 "uuid": "e04c3bce-3432-4037-883b-7bc3b845c692", 00:27:57.094 "strip_size_kb": 64, 00:27:57.094 "state": "configuring", 00:27:57.094 "raid_level": "raid0", 00:27:57.094 "superblock": true, 00:27:57.094 "num_base_bdevs": 2, 00:27:57.094 "num_base_bdevs_discovered": 1, 00:27:57.094 "num_base_bdevs_operational": 2, 00:27:57.094 "base_bdevs_list": [ 00:27:57.094 { 00:27:57.094 "name": "BaseBdev1", 00:27:57.094 "uuid": "55791164-4e45-4b3a-a8f3-5494c6aa8f78", 00:27:57.094 "is_configured": true, 00:27:57.094 "data_offset": 2048, 00:27:57.094 "data_size": 63488 00:27:57.094 }, 00:27:57.094 { 00:27:57.094 "name": "BaseBdev2", 00:27:57.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.094 "is_configured": false, 00:27:57.094 "data_offset": 0, 00:27:57.094 "data_size": 0 00:27:57.094 } 00:27:57.094 ] 00:27:57.094 }' 00:27:57.094 11:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.094 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:57.661 11:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:27:57.921 [2024-06-10 11:38:41.691792] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:57.921 [2024-06-10 11:38:41.691921] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa15bf0 00:27:57.921 
[2024-06-10 11:38:41.691931] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:27:57.921 [2024-06-10 11:38:41.692042] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbc79b0 00:27:57.921 [2024-06-10 11:38:41.692120] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa15bf0 00:27:57.921 [2024-06-10 11:38:41.692126] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa15bf0 00:27:57.921 [2024-06-10 11:38:41.692187] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:57.921 BaseBdev2 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:57.921 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:58.180 11:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:58.180 [ 00:27:58.180 { 00:27:58.180 "name": "BaseBdev2", 00:27:58.180 "aliases": [ 00:27:58.180 "31a63e73-f89b-4566-902b-5555162cf402" 00:27:58.180 ], 00:27:58.180 "product_name": "Malloc disk", 00:27:58.180 "block_size": 512, 
00:27:58.180 "num_blocks": 65536, 00:27:58.180 "uuid": "31a63e73-f89b-4566-902b-5555162cf402", 00:27:58.180 "assigned_rate_limits": { 00:27:58.180 "rw_ios_per_sec": 0, 00:27:58.180 "rw_mbytes_per_sec": 0, 00:27:58.180 "r_mbytes_per_sec": 0, 00:27:58.180 "w_mbytes_per_sec": 0 00:27:58.180 }, 00:27:58.180 "claimed": true, 00:27:58.180 "claim_type": "exclusive_write", 00:27:58.180 "zoned": false, 00:27:58.180 "supported_io_types": { 00:27:58.180 "read": true, 00:27:58.180 "write": true, 00:27:58.180 "unmap": true, 00:27:58.180 "write_zeroes": true, 00:27:58.180 "flush": true, 00:27:58.180 "reset": true, 00:27:58.180 "compare": false, 00:27:58.180 "compare_and_write": false, 00:27:58.180 "abort": true, 00:27:58.180 "nvme_admin": false, 00:27:58.180 "nvme_io": false 00:27:58.180 }, 00:27:58.180 "memory_domains": [ 00:27:58.180 { 00:27:58.180 "dma_device_id": "system", 00:27:58.180 "dma_device_type": 1 00:27:58.180 }, 00:27:58.180 { 00:27:58.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:58.180 "dma_device_type": 2 00:27:58.180 } 00:27:58.180 ], 00:27:58.180 "driver_specific": {} 00:27:58.180 } 00:27:58.180 ] 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:27:58.180 11:38:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.180 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.181 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.181 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.181 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:58.438 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.438 "name": "Existed_Raid", 00:27:58.438 "uuid": "e04c3bce-3432-4037-883b-7bc3b845c692", 00:27:58.438 "strip_size_kb": 64, 00:27:58.438 "state": "online", 00:27:58.438 "raid_level": "raid0", 00:27:58.438 "superblock": true, 00:27:58.439 "num_base_bdevs": 2, 00:27:58.439 "num_base_bdevs_discovered": 2, 00:27:58.439 "num_base_bdevs_operational": 2, 00:27:58.439 "base_bdevs_list": [ 00:27:58.439 { 00:27:58.439 "name": "BaseBdev1", 00:27:58.439 "uuid": "55791164-4e45-4b3a-a8f3-5494c6aa8f78", 00:27:58.439 "is_configured": true, 00:27:58.439 "data_offset": 2048, 00:27:58.439 "data_size": 63488 00:27:58.439 }, 00:27:58.439 { 00:27:58.439 "name": "BaseBdev2", 00:27:58.439 "uuid": "31a63e73-f89b-4566-902b-5555162cf402", 00:27:58.439 "is_configured": true, 00:27:58.439 "data_offset": 2048, 00:27:58.439 "data_size": 63488 00:27:58.439 } 00:27:58.439 ] 00:27:58.439 }' 00:27:58.439 
11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.439 11:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:59.005 [2024-06-10 11:38:42.866989] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:59.005 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:59.006 "name": "Existed_Raid", 00:27:59.006 "aliases": [ 00:27:59.006 "e04c3bce-3432-4037-883b-7bc3b845c692" 00:27:59.006 ], 00:27:59.006 "product_name": "Raid Volume", 00:27:59.006 "block_size": 512, 00:27:59.006 "num_blocks": 126976, 00:27:59.006 "uuid": "e04c3bce-3432-4037-883b-7bc3b845c692", 00:27:59.006 "assigned_rate_limits": { 00:27:59.006 "rw_ios_per_sec": 0, 00:27:59.006 "rw_mbytes_per_sec": 0, 00:27:59.006 "r_mbytes_per_sec": 0, 00:27:59.006 "w_mbytes_per_sec": 0 00:27:59.006 }, 00:27:59.006 "claimed": false, 00:27:59.006 "zoned": false, 00:27:59.006 
"supported_io_types": { 00:27:59.006 "read": true, 00:27:59.006 "write": true, 00:27:59.006 "unmap": true, 00:27:59.006 "write_zeroes": true, 00:27:59.006 "flush": true, 00:27:59.006 "reset": true, 00:27:59.006 "compare": false, 00:27:59.006 "compare_and_write": false, 00:27:59.006 "abort": false, 00:27:59.006 "nvme_admin": false, 00:27:59.006 "nvme_io": false 00:27:59.006 }, 00:27:59.006 "memory_domains": [ 00:27:59.006 { 00:27:59.006 "dma_device_id": "system", 00:27:59.006 "dma_device_type": 1 00:27:59.006 }, 00:27:59.006 { 00:27:59.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:59.006 "dma_device_type": 2 00:27:59.006 }, 00:27:59.006 { 00:27:59.006 "dma_device_id": "system", 00:27:59.006 "dma_device_type": 1 00:27:59.006 }, 00:27:59.006 { 00:27:59.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:59.006 "dma_device_type": 2 00:27:59.006 } 00:27:59.006 ], 00:27:59.006 "driver_specific": { 00:27:59.006 "raid": { 00:27:59.006 "uuid": "e04c3bce-3432-4037-883b-7bc3b845c692", 00:27:59.006 "strip_size_kb": 64, 00:27:59.006 "state": "online", 00:27:59.006 "raid_level": "raid0", 00:27:59.006 "superblock": true, 00:27:59.006 "num_base_bdevs": 2, 00:27:59.006 "num_base_bdevs_discovered": 2, 00:27:59.006 "num_base_bdevs_operational": 2, 00:27:59.006 "base_bdevs_list": [ 00:27:59.006 { 00:27:59.006 "name": "BaseBdev1", 00:27:59.006 "uuid": "55791164-4e45-4b3a-a8f3-5494c6aa8f78", 00:27:59.006 "is_configured": true, 00:27:59.006 "data_offset": 2048, 00:27:59.006 "data_size": 63488 00:27:59.006 }, 00:27:59.006 { 00:27:59.006 "name": "BaseBdev2", 00:27:59.006 "uuid": "31a63e73-f89b-4566-902b-5555162cf402", 00:27:59.006 "is_configured": true, 00:27:59.006 "data_offset": 2048, 00:27:59.006 "data_size": 63488 00:27:59.006 } 00:27:59.006 ] 00:27:59.006 } 00:27:59.006 } 00:27:59.006 }' 00:27:59.006 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:59.006 
11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:59.006 BaseBdev2' 00:27:59.006 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:59.006 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:59.006 11:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:59.265 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:59.265 "name": "BaseBdev1", 00:27:59.265 "aliases": [ 00:27:59.265 "55791164-4e45-4b3a-a8f3-5494c6aa8f78" 00:27:59.265 ], 00:27:59.265 "product_name": "Malloc disk", 00:27:59.265 "block_size": 512, 00:27:59.265 "num_blocks": 65536, 00:27:59.265 "uuid": "55791164-4e45-4b3a-a8f3-5494c6aa8f78", 00:27:59.265 "assigned_rate_limits": { 00:27:59.265 "rw_ios_per_sec": 0, 00:27:59.265 "rw_mbytes_per_sec": 0, 00:27:59.265 "r_mbytes_per_sec": 0, 00:27:59.265 "w_mbytes_per_sec": 0 00:27:59.265 }, 00:27:59.265 "claimed": true, 00:27:59.265 "claim_type": "exclusive_write", 00:27:59.265 "zoned": false, 00:27:59.265 "supported_io_types": { 00:27:59.265 "read": true, 00:27:59.265 "write": true, 00:27:59.265 "unmap": true, 00:27:59.265 "write_zeroes": true, 00:27:59.265 "flush": true, 00:27:59.265 "reset": true, 00:27:59.265 "compare": false, 00:27:59.265 "compare_and_write": false, 00:27:59.265 "abort": true, 00:27:59.265 "nvme_admin": false, 00:27:59.265 "nvme_io": false 00:27:59.265 }, 00:27:59.265 "memory_domains": [ 00:27:59.265 { 00:27:59.265 "dma_device_id": "system", 00:27:59.265 "dma_device_type": 1 00:27:59.265 }, 00:27:59.265 { 00:27:59.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:59.265 "dma_device_type": 2 00:27:59.265 } 00:27:59.265 ], 00:27:59.265 "driver_specific": {} 00:27:59.265 }' 00:27:59.265 11:38:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:59.265 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:59.265 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:59.265 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:59.524 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:59.782 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:59.782 "name": "BaseBdev2", 00:27:59.782 "aliases": [ 00:27:59.782 "31a63e73-f89b-4566-902b-5555162cf402" 00:27:59.782 ], 00:27:59.782 "product_name": "Malloc disk", 00:27:59.782 "block_size": 512, 00:27:59.782 
"num_blocks": 65536, 00:27:59.782 "uuid": "31a63e73-f89b-4566-902b-5555162cf402", 00:27:59.782 "assigned_rate_limits": { 00:27:59.782 "rw_ios_per_sec": 0, 00:27:59.782 "rw_mbytes_per_sec": 0, 00:27:59.782 "r_mbytes_per_sec": 0, 00:27:59.782 "w_mbytes_per_sec": 0 00:27:59.782 }, 00:27:59.782 "claimed": true, 00:27:59.782 "claim_type": "exclusive_write", 00:27:59.782 "zoned": false, 00:27:59.782 "supported_io_types": { 00:27:59.782 "read": true, 00:27:59.782 "write": true, 00:27:59.783 "unmap": true, 00:27:59.783 "write_zeroes": true, 00:27:59.783 "flush": true, 00:27:59.783 "reset": true, 00:27:59.783 "compare": false, 00:27:59.783 "compare_and_write": false, 00:27:59.783 "abort": true, 00:27:59.783 "nvme_admin": false, 00:27:59.783 "nvme_io": false 00:27:59.783 }, 00:27:59.783 "memory_domains": [ 00:27:59.783 { 00:27:59.783 "dma_device_id": "system", 00:27:59.783 "dma_device_type": 1 00:27:59.783 }, 00:27:59.783 { 00:27:59.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:59.783 "dma_device_type": 2 00:27:59.783 } 00:27:59.783 ], 00:27:59.783 "driver_specific": {} 00:27:59.783 }' 00:27:59.783 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:59.783 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:59.783 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:59.783 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:59.783 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:00.041 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:00.042 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:00.042 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:00.042 11:38:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:00.042 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:00.042 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:00.042 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:00.042 11:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:00.301 [2024-06-10 11:38:44.062108] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:00.301 [2024-06-10 11:38:44.062128] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:00.301 [2024-06-10 11:38:44.062157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.301 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:00.560 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.560 "name": "Existed_Raid", 00:28:00.560 "uuid": "e04c3bce-3432-4037-883b-7bc3b845c692", 00:28:00.560 "strip_size_kb": 64, 00:28:00.560 "state": "offline", 00:28:00.560 "raid_level": "raid0", 00:28:00.560 "superblock": true, 00:28:00.560 "num_base_bdevs": 2, 00:28:00.560 "num_base_bdevs_discovered": 1, 00:28:00.560 "num_base_bdevs_operational": 1, 00:28:00.560 "base_bdevs_list": [ 00:28:00.560 { 00:28:00.560 "name": null, 00:28:00.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.560 "is_configured": false, 00:28:00.560 "data_offset": 2048, 00:28:00.560 "data_size": 63488 00:28:00.560 }, 00:28:00.560 { 00:28:00.560 "name": "BaseBdev2", 00:28:00.560 "uuid": "31a63e73-f89b-4566-902b-5555162cf402", 00:28:00.560 "is_configured": true, 00:28:00.560 "data_offset": 2048, 00:28:00.560 "data_size": 63488 00:28:00.560 } 00:28:00.560 
] 00:28:00.560 }' 00:28:00.560 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.560 11:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:00.819 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:00.819 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:00.819 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.819 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:01.078 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:01.078 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:01.078 11:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:01.337 [2024-06-10 11:38:45.061485] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:01.337 [2024-06-10 11:38:45.061521] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa15bf0 name Existed_Raid, state offline 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | 
select(.)' 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 131049 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 131049 ']' 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 131049 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:01.337 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 131049 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 131049' 00:28:01.596 killing process with pid 131049 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 131049 00:28:01.596 [2024-06-10 11:38:45.303361] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 131049 00:28:01.596 [2024-06-10 11:38:45.304232] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:28:01.596 00:28:01.596 real 
0m8.265s 00:28:01.596 user 0m14.531s 00:28:01.596 sys 0m1.633s 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:01.596 11:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:01.596 ************************************ 00:28:01.596 END TEST raid_state_function_test_sb 00:28:01.596 ************************************ 00:28:01.855 11:38:45 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:28:01.855 11:38:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:28:01.855 11:38:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:01.855 11:38:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:01.855 ************************************ 00:28:01.855 START TEST raid_superblock_test 00:28:01.855 ************************************ 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 2 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local 
raid_bdev_name=raid_bdev1 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=132412 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 132412 /var/tmp/spdk-raid.sock 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 132412 ']' 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:01.855 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:01.855 11:38:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:01.855 [2024-06-10 11:38:45.649373] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:28:01.855 [2024-06-10 11:38:45.649428] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid132412 ] 00:28:01.855 [2024-06-10 11:38:45.735478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:02.115 [2024-06-10 11:38:45.821050] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:02.115 [2024-06-10 11:38:45.885764] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:02.115 [2024-06-10 11:38:45.885792] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:02.682 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:02.683 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:28:02.683 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:02.683 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:28:02.683 malloc1 00:28:02.941 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:02.941 [2024-06-10 11:38:46.790586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:02.941 [2024-06-10 11:38:46.790629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.941 [2024-06-10 11:38:46.790644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf54100 00:28:02.941 [2024-06-10 11:38:46.790652] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.941 [2024-06-10 11:38:46.791837] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.941 [2024-06-10 11:38:46.791860] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:02.941 pt1 00:28:02.941 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:02.941 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:02.941 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:02.941 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:02.941 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:02.942 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:28:02.942 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:02.942 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:02.942 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:28:03.201 malloc2 00:28:03.201 11:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:03.201 [2024-06-10 11:38:47.143376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:03.201 [2024-06-10 11:38:47.143413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:03.201 [2024-06-10 11:38:47.143427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf55500 00:28:03.201 [2024-06-10 11:38:47.143436] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:03.201 [2024-06-10 11:38:47.144441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:03.201 [2024-06-10 11:38:47.144461] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:03.460 pt2 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:03.460 [2024-06-10 11:38:47.315838] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:03.460 [2024-06-10 11:38:47.316632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:03.460 [2024-06-10 11:38:47.316728] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf54c00 00:28:03.460 [2024-06-10 11:38:47.316737] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:03.460 [2024-06-10 11:38:47.316853] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf53a00 00:28:03.460 [2024-06-10 11:38:47.316953] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf54c00 00:28:03.460 [2024-06-10 11:38:47.316959] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf54c00 00:28:03.460 [2024-06-10 11:38:47.317017] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.460 11:38:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.460 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.719 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.719 "name": "raid_bdev1", 00:28:03.719 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:03.719 "strip_size_kb": 64, 00:28:03.719 "state": "online", 00:28:03.719 "raid_level": "raid0", 00:28:03.719 "superblock": true, 00:28:03.719 "num_base_bdevs": 2, 00:28:03.719 "num_base_bdevs_discovered": 2, 00:28:03.719 "num_base_bdevs_operational": 2, 00:28:03.719 "base_bdevs_list": [ 00:28:03.719 { 00:28:03.719 "name": "pt1", 00:28:03.719 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.719 "is_configured": true, 00:28:03.719 "data_offset": 2048, 00:28:03.719 "data_size": 63488 00:28:03.719 }, 00:28:03.719 { 00:28:03.719 "name": "pt2", 00:28:03.719 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.719 "is_configured": true, 00:28:03.719 "data_offset": 2048, 00:28:03.719 "data_size": 63488 00:28:03.719 } 00:28:03.719 ] 00:28:03.719 }' 00:28:03.719 11:38:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.719 11:38:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:04.287 [2024-06-10 11:38:48.170178] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:04.287 "name": "raid_bdev1", 00:28:04.287 "aliases": [ 00:28:04.287 "34dfadfa-a72b-477b-b0a4-5e638da4b5d1" 00:28:04.287 ], 00:28:04.287 "product_name": "Raid Volume", 00:28:04.287 "block_size": 512, 00:28:04.287 "num_blocks": 126976, 00:28:04.287 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:04.287 "assigned_rate_limits": { 00:28:04.287 "rw_ios_per_sec": 0, 00:28:04.287 "rw_mbytes_per_sec": 0, 00:28:04.287 "r_mbytes_per_sec": 0, 00:28:04.287 "w_mbytes_per_sec": 0 00:28:04.287 }, 00:28:04.287 "claimed": false, 00:28:04.287 "zoned": false, 00:28:04.287 "supported_io_types": { 00:28:04.287 "read": true, 00:28:04.287 "write": true, 00:28:04.287 "unmap": true, 00:28:04.287 "write_zeroes": true, 00:28:04.287 "flush": true, 00:28:04.287 "reset": true, 00:28:04.287 "compare": false, 00:28:04.287 "compare_and_write": false, 00:28:04.287 "abort": false, 00:28:04.287 "nvme_admin": false, 00:28:04.287 "nvme_io": false 00:28:04.287 }, 00:28:04.287 "memory_domains": [ 00:28:04.287 { 00:28:04.287 "dma_device_id": "system", 00:28:04.287 "dma_device_type": 1 00:28:04.287 }, 00:28:04.287 { 00:28:04.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.287 "dma_device_type": 2 00:28:04.287 }, 00:28:04.287 { 00:28:04.287 "dma_device_id": "system", 00:28:04.287 
"dma_device_type": 1 00:28:04.287 }, 00:28:04.287 { 00:28:04.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.287 "dma_device_type": 2 00:28:04.287 } 00:28:04.287 ], 00:28:04.287 "driver_specific": { 00:28:04.287 "raid": { 00:28:04.287 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:04.287 "strip_size_kb": 64, 00:28:04.287 "state": "online", 00:28:04.287 "raid_level": "raid0", 00:28:04.287 "superblock": true, 00:28:04.287 "num_base_bdevs": 2, 00:28:04.287 "num_base_bdevs_discovered": 2, 00:28:04.287 "num_base_bdevs_operational": 2, 00:28:04.287 "base_bdevs_list": [ 00:28:04.287 { 00:28:04.287 "name": "pt1", 00:28:04.287 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:04.287 "is_configured": true, 00:28:04.287 "data_offset": 2048, 00:28:04.287 "data_size": 63488 00:28:04.287 }, 00:28:04.287 { 00:28:04.287 "name": "pt2", 00:28:04.287 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:04.287 "is_configured": true, 00:28:04.287 "data_offset": 2048, 00:28:04.287 "data_size": 63488 00:28:04.287 } 00:28:04.287 ] 00:28:04.287 } 00:28:04.287 } 00:28:04.287 }' 00:28:04.287 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:04.546 pt2' 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:04.546 "name": "pt1", 00:28:04.546 "aliases": [ 00:28:04.546 "00000000-0000-0000-0000-000000000001" 00:28:04.546 
], 00:28:04.546 "product_name": "passthru", 00:28:04.546 "block_size": 512, 00:28:04.546 "num_blocks": 65536, 00:28:04.546 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:04.546 "assigned_rate_limits": { 00:28:04.546 "rw_ios_per_sec": 0, 00:28:04.546 "rw_mbytes_per_sec": 0, 00:28:04.546 "r_mbytes_per_sec": 0, 00:28:04.546 "w_mbytes_per_sec": 0 00:28:04.546 }, 00:28:04.546 "claimed": true, 00:28:04.546 "claim_type": "exclusive_write", 00:28:04.546 "zoned": false, 00:28:04.546 "supported_io_types": { 00:28:04.546 "read": true, 00:28:04.546 "write": true, 00:28:04.546 "unmap": true, 00:28:04.546 "write_zeroes": true, 00:28:04.546 "flush": true, 00:28:04.546 "reset": true, 00:28:04.546 "compare": false, 00:28:04.546 "compare_and_write": false, 00:28:04.546 "abort": true, 00:28:04.546 "nvme_admin": false, 00:28:04.546 "nvme_io": false 00:28:04.546 }, 00:28:04.546 "memory_domains": [ 00:28:04.546 { 00:28:04.546 "dma_device_id": "system", 00:28:04.546 "dma_device_type": 1 00:28:04.546 }, 00:28:04.546 { 00:28:04.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.546 "dma_device_type": 2 00:28:04.546 } 00:28:04.546 ], 00:28:04.546 "driver_specific": { 00:28:04.546 "passthru": { 00:28:04.546 "name": "pt1", 00:28:04.546 "base_bdev_name": "malloc1" 00:28:04.546 } 00:28:04.546 } 00:28:04.546 }' 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:04.546 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:04.805 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:05.064 "name": "pt2", 00:28:05.064 "aliases": [ 00:28:05.064 "00000000-0000-0000-0000-000000000002" 00:28:05.064 ], 00:28:05.064 "product_name": "passthru", 00:28:05.064 "block_size": 512, 00:28:05.064 "num_blocks": 65536, 00:28:05.064 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:05.064 "assigned_rate_limits": { 00:28:05.064 "rw_ios_per_sec": 0, 00:28:05.064 "rw_mbytes_per_sec": 0, 00:28:05.064 "r_mbytes_per_sec": 0, 00:28:05.064 "w_mbytes_per_sec": 0 00:28:05.064 }, 00:28:05.064 "claimed": true, 00:28:05.064 "claim_type": "exclusive_write", 00:28:05.064 "zoned": false, 00:28:05.064 "supported_io_types": { 00:28:05.064 "read": true, 00:28:05.064 "write": true, 00:28:05.064 "unmap": true, 00:28:05.064 "write_zeroes": true, 00:28:05.064 "flush": true, 00:28:05.064 "reset": true, 00:28:05.064 "compare": false, 00:28:05.064 "compare_and_write": false, 00:28:05.064 "abort": true, 00:28:05.064 "nvme_admin": false, 00:28:05.064 "nvme_io": false 00:28:05.064 }, 
00:28:05.064 "memory_domains": [ 00:28:05.064 { 00:28:05.064 "dma_device_id": "system", 00:28:05.064 "dma_device_type": 1 00:28:05.064 }, 00:28:05.064 { 00:28:05.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:05.064 "dma_device_type": 2 00:28:05.064 } 00:28:05.064 ], 00:28:05.064 "driver_specific": { 00:28:05.064 "passthru": { 00:28:05.064 "name": "pt2", 00:28:05.064 "base_bdev_name": "malloc2" 00:28:05.064 } 00:28:05.064 } 00:28:05.064 }' 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:05.064 11:38:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:05.322 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:05.583 [2024-06-10 11:38:49.301082] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:05.583 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=34dfadfa-a72b-477b-b0a4-5e638da4b5d1 00:28:05.583 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 34dfadfa-a72b-477b-b0a4-5e638da4b5d1 ']' 00:28:05.583 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:05.583 [2024-06-10 11:38:49.481410] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:05.583 [2024-06-10 11:38:49.481425] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:05.583 [2024-06-10 11:38:49.481463] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:05.583 [2024-06-10 11:38:49.481491] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:05.583 [2024-06-10 11:38:49.481499] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf54c00 name raid_bdev1, state offline 00:28:05.583 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.583 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:05.843 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:05.843 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:05.843 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:05.843 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:28:06.102 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:06.102 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:06.102 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:06.102 11:38:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:06.361 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.620 [2024-06-10 11:38:50.327601] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:06.620 [2024-06-10 11:38:50.328673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:06.620 [2024-06-10 11:38:50.328721] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:06.620 [2024-06-10 11:38:50.328754] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:06.620 [2024-06-10 11:38:50.328767] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:06.620 [2024-06-10 11:38:50.328775] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1101c50 name raid_bdev1, state configuring 00:28:06.620 request: 00:28:06.620 { 00:28:06.620 "name": "raid_bdev1", 00:28:06.620 "raid_level": "raid0", 00:28:06.620 "base_bdevs": [ 00:28:06.620 "malloc1", 00:28:06.620 "malloc2" 00:28:06.620 ], 00:28:06.620 "superblock": false, 00:28:06.620 "strip_size_kb": 64, 00:28:06.620 "method": "bdev_raid_create", 00:28:06.620 "req_id": 1 00:28:06.620 } 
00:28:06.620 Got JSON-RPC error response 00:28:06.620 response: 00:28:06.620 { 00:28:06.620 "code": -17, 00:28:06.620 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:06.620 } 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:06.620 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:06.880 [2024-06-10 11:38:50.656407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:06.880 [2024-06-10 11:38:50.656443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:06.880 [2024-06-10 11:38:50.656455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1100ad0 00:28:06.880 [2024-06-10 11:38:50.656464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:06.880 [2024-06-10 11:38:50.657658] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:06.880 [2024-06-10 11:38:50.657681] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:06.880 [2024-06-10 11:38:50.657729] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:06.880 [2024-06-10 11:38:50.657750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:06.880 pt1 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.880 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.139 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.139 "name": "raid_bdev1", 00:28:07.139 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:07.139 "strip_size_kb": 64, 
00:28:07.139 "state": "configuring", 00:28:07.139 "raid_level": "raid0", 00:28:07.139 "superblock": true, 00:28:07.139 "num_base_bdevs": 2, 00:28:07.139 "num_base_bdevs_discovered": 1, 00:28:07.139 "num_base_bdevs_operational": 2, 00:28:07.139 "base_bdevs_list": [ 00:28:07.139 { 00:28:07.139 "name": "pt1", 00:28:07.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:07.139 "is_configured": true, 00:28:07.139 "data_offset": 2048, 00:28:07.139 "data_size": 63488 00:28:07.139 }, 00:28:07.139 { 00:28:07.139 "name": null, 00:28:07.139 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:07.139 "is_configured": false, 00:28:07.139 "data_offset": 2048, 00:28:07.139 "data_size": 63488 00:28:07.139 } 00:28:07.139 ] 00:28:07.139 }' 00:28:07.139 11:38:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.139 11:38:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:07.397 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:07.397 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:07.397 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:07.397 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:07.656 [2024-06-10 11:38:51.458472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:07.656 [2024-06-10 11:38:51.458510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:07.656 [2024-06-10 11:38:51.458537] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1102550 00:28:07.656 [2024-06-10 11:38:51.458545] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:07.656 [2024-06-10 
11:38:51.458789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:07.656 [2024-06-10 11:38:51.458801] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:07.656 [2024-06-10 11:38:51.458844] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:07.656 [2024-06-10 11:38:51.458857] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:07.656 [2024-06-10 11:38:51.458930] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf535c0 00:28:07.656 [2024-06-10 11:38:51.458937] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:07.656 [2024-06-10 11:38:51.459049] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10feae0 00:28:07.656 [2024-06-10 11:38:51.459133] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf535c0 00:28:07.656 [2024-06-10 11:38:51.459139] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf535c0 00:28:07.656 [2024-06-10 11:38:51.459210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:07.656 pt2 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.656 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.915 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.915 "name": "raid_bdev1", 00:28:07.915 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:07.915 "strip_size_kb": 64, 00:28:07.915 "state": "online", 00:28:07.915 "raid_level": "raid0", 00:28:07.915 "superblock": true, 00:28:07.915 "num_base_bdevs": 2, 00:28:07.915 "num_base_bdevs_discovered": 2, 00:28:07.915 "num_base_bdevs_operational": 2, 00:28:07.915 "base_bdevs_list": [ 00:28:07.915 { 00:28:07.915 "name": "pt1", 00:28:07.915 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:07.915 "is_configured": true, 00:28:07.915 "data_offset": 2048, 00:28:07.915 "data_size": 63488 00:28:07.915 }, 00:28:07.915 { 00:28:07.915 "name": "pt2", 00:28:07.915 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:07.915 "is_configured": true, 00:28:07.915 "data_offset": 2048, 00:28:07.915 "data_size": 63488 00:28:07.915 } 00:28:07.915 ] 00:28:07.915 }' 00:28:07.915 11:38:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.915 11:38:51 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:08.542 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:08.543 [2024-06-10 11:38:52.320999] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:08.543 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:08.543 "name": "raid_bdev1", 00:28:08.543 "aliases": [ 00:28:08.543 "34dfadfa-a72b-477b-b0a4-5e638da4b5d1" 00:28:08.543 ], 00:28:08.543 "product_name": "Raid Volume", 00:28:08.543 "block_size": 512, 00:28:08.543 "num_blocks": 126976, 00:28:08.543 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:08.543 "assigned_rate_limits": { 00:28:08.543 "rw_ios_per_sec": 0, 00:28:08.543 "rw_mbytes_per_sec": 0, 00:28:08.543 "r_mbytes_per_sec": 0, 00:28:08.543 "w_mbytes_per_sec": 0 00:28:08.543 }, 00:28:08.543 "claimed": false, 00:28:08.543 "zoned": false, 00:28:08.543 "supported_io_types": { 00:28:08.543 "read": true, 00:28:08.543 "write": true, 00:28:08.543 "unmap": true, 00:28:08.543 "write_zeroes": true, 00:28:08.543 "flush": true, 00:28:08.543 "reset": true, 00:28:08.543 "compare": 
false, 00:28:08.543 "compare_and_write": false, 00:28:08.543 "abort": false, 00:28:08.543 "nvme_admin": false, 00:28:08.543 "nvme_io": false 00:28:08.543 }, 00:28:08.543 "memory_domains": [ 00:28:08.543 { 00:28:08.543 "dma_device_id": "system", 00:28:08.543 "dma_device_type": 1 00:28:08.543 }, 00:28:08.543 { 00:28:08.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.543 "dma_device_type": 2 00:28:08.543 }, 00:28:08.543 { 00:28:08.543 "dma_device_id": "system", 00:28:08.543 "dma_device_type": 1 00:28:08.543 }, 00:28:08.543 { 00:28:08.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.543 "dma_device_type": 2 00:28:08.543 } 00:28:08.543 ], 00:28:08.543 "driver_specific": { 00:28:08.543 "raid": { 00:28:08.543 "uuid": "34dfadfa-a72b-477b-b0a4-5e638da4b5d1", 00:28:08.543 "strip_size_kb": 64, 00:28:08.543 "state": "online", 00:28:08.543 "raid_level": "raid0", 00:28:08.543 "superblock": true, 00:28:08.543 "num_base_bdevs": 2, 00:28:08.543 "num_base_bdevs_discovered": 2, 00:28:08.543 "num_base_bdevs_operational": 2, 00:28:08.543 "base_bdevs_list": [ 00:28:08.543 { 00:28:08.543 "name": "pt1", 00:28:08.543 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:08.543 "is_configured": true, 00:28:08.543 "data_offset": 2048, 00:28:08.543 "data_size": 63488 00:28:08.543 }, 00:28:08.543 { 00:28:08.543 "name": "pt2", 00:28:08.543 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:08.543 "is_configured": true, 00:28:08.543 "data_offset": 2048, 00:28:08.543 "data_size": 63488 00:28:08.543 } 00:28:08.543 ] 00:28:08.543 } 00:28:08.543 } 00:28:08.543 }' 00:28:08.543 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:08.543 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:08.543 pt2' 00:28:08.543 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:08.543 11:38:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:08.543 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:08.802 "name": "pt1", 00:28:08.802 "aliases": [ 00:28:08.802 "00000000-0000-0000-0000-000000000001" 00:28:08.802 ], 00:28:08.802 "product_name": "passthru", 00:28:08.802 "block_size": 512, 00:28:08.802 "num_blocks": 65536, 00:28:08.802 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:08.802 "assigned_rate_limits": { 00:28:08.802 "rw_ios_per_sec": 0, 00:28:08.802 "rw_mbytes_per_sec": 0, 00:28:08.802 "r_mbytes_per_sec": 0, 00:28:08.802 "w_mbytes_per_sec": 0 00:28:08.802 }, 00:28:08.802 "claimed": true, 00:28:08.802 "claim_type": "exclusive_write", 00:28:08.802 "zoned": false, 00:28:08.802 "supported_io_types": { 00:28:08.802 "read": true, 00:28:08.802 "write": true, 00:28:08.802 "unmap": true, 00:28:08.802 "write_zeroes": true, 00:28:08.802 "flush": true, 00:28:08.802 "reset": true, 00:28:08.802 "compare": false, 00:28:08.802 "compare_and_write": false, 00:28:08.802 "abort": true, 00:28:08.802 "nvme_admin": false, 00:28:08.802 "nvme_io": false 00:28:08.802 }, 00:28:08.802 "memory_domains": [ 00:28:08.802 { 00:28:08.802 "dma_device_id": "system", 00:28:08.802 "dma_device_type": 1 00:28:08.802 }, 00:28:08.802 { 00:28:08.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.802 "dma_device_type": 2 00:28:08.802 } 00:28:08.802 ], 00:28:08.802 "driver_specific": { 00:28:08.802 "passthru": { 00:28:08.802 "name": "pt1", 00:28:08.802 "base_bdev_name": "malloc1" 00:28:08.802 } 00:28:08.802 } 00:28:08.802 }' 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:08.802 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:09.061 11:38:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:09.061 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:09.061 "name": "pt2", 00:28:09.061 "aliases": [ 00:28:09.061 "00000000-0000-0000-0000-000000000002" 00:28:09.061 ], 00:28:09.061 "product_name": "passthru", 00:28:09.061 "block_size": 512, 00:28:09.061 "num_blocks": 65536, 00:28:09.061 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:09.061 "assigned_rate_limits": { 00:28:09.061 "rw_ios_per_sec": 0, 00:28:09.061 "rw_mbytes_per_sec": 0, 00:28:09.061 "r_mbytes_per_sec": 0, 00:28:09.061 "w_mbytes_per_sec": 0 00:28:09.061 }, 00:28:09.061 
"claimed": true, 00:28:09.061 "claim_type": "exclusive_write", 00:28:09.061 "zoned": false, 00:28:09.061 "supported_io_types": { 00:28:09.062 "read": true, 00:28:09.062 "write": true, 00:28:09.062 "unmap": true, 00:28:09.062 "write_zeroes": true, 00:28:09.062 "flush": true, 00:28:09.062 "reset": true, 00:28:09.062 "compare": false, 00:28:09.062 "compare_and_write": false, 00:28:09.062 "abort": true, 00:28:09.062 "nvme_admin": false, 00:28:09.062 "nvme_io": false 00:28:09.062 }, 00:28:09.062 "memory_domains": [ 00:28:09.062 { 00:28:09.062 "dma_device_id": "system", 00:28:09.062 "dma_device_type": 1 00:28:09.062 }, 00:28:09.062 { 00:28:09.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.062 "dma_device_type": 2 00:28:09.062 } 00:28:09.062 ], 00:28:09.062 "driver_specific": { 00:28:09.062 "passthru": { 00:28:09.062 "name": "pt2", 00:28:09.062 "base_bdev_name": "malloc2" 00:28:09.062 } 00:28:09.062 } 00:28:09.062 }' 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.320 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:09.579 [2024-06-10 11:38:53.435875] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 34dfadfa-a72b-477b-b0a4-5e638da4b5d1 '!=' 34dfadfa-a72b-477b-b0a4-5e638da4b5d1 ']' 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 132412 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 132412 ']' 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 132412 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 132412 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process 
with pid 132412' 00:28:09.579 killing process with pid 132412 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 132412 00:28:09.579 [2024-06-10 11:38:53.499876] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:09.579 [2024-06-10 11:38:53.499921] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:09.579 [2024-06-10 11:38:53.499952] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:09.579 [2024-06-10 11:38:53.499961] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf535c0 name raid_bdev1, state offline 00:28:09.579 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 132412 00:28:09.579 [2024-06-10 11:38:53.515147] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:09.839 11:38:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:28:09.839 00:28:09.839 real 0m8.106s 00:28:09.839 user 0m14.341s 00:28:09.839 sys 0m1.563s 00:28:09.839 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:09.839 11:38:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:09.839 ************************************ 00:28:09.839 END TEST raid_superblock_test 00:28:09.839 ************************************ 00:28:09.839 11:38:53 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:28:09.839 11:38:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:09.839 11:38:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:09.839 11:38:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:09.839 ************************************ 00:28:09.839 START TEST raid_read_error_test 00:28:09.839 ************************************ 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 read 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.uiV32z4KHG 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=133697 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 133697 /var/tmp/spdk-raid.sock 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 133697 ']' 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:09.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:09.839 11:38:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:10.099 [2024-06-10 11:38:53.833621] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:28:10.099 [2024-06-10 11:38:53.833666] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133697 ] 00:28:10.099 [2024-06-10 11:38:53.917611] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:10.099 [2024-06-10 11:38:53.997782] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:10.358 [2024-06-10 11:38:54.054057] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:10.358 [2024-06-10 11:38:54.054085] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:10.927 11:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:10.927 11:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:28:10.927 11:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:10.927 11:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:10.927 BaseBdev1_malloc 00:28:10.927 11:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:28:11.186 true 00:28:11.186 11:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:28:11.445 [2024-06-10 11:38:55.133162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:28:11.445 [2024-06-10 11:38:55.133200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:11.445 
[2024-06-10 11:38:55.133215] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f4cb10 00:28:11.445 [2024-06-10 11:38:55.133224] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:11.445 [2024-06-10 11:38:55.134604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:11.445 [2024-06-10 11:38:55.134626] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:11.445 BaseBdev1 00:28:11.445 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:11.445 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:11.445 BaseBdev2_malloc 00:28:11.445 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:28:11.704 true 00:28:11.704 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:28:11.963 [2024-06-10 11:38:55.655354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:28:11.963 [2024-06-10 11:38:55.655396] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:11.963 [2024-06-10 11:38:55.655410] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f51280 00:28:11.963 [2024-06-10 11:38:55.655418] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:11.963 [2024-06-10 11:38:55.656615] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:11.963 [2024-06-10 11:38:55.656636] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:11.963 BaseBdev2 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:28:11.963 [2024-06-10 11:38:55.815792] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:11.963 [2024-06-10 11:38:55.816717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:11.963 [2024-06-10 11:38:55.816851] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f50ca0 00:28:11.963 [2024-06-10 11:38:55.816860] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:11.963 [2024-06-10 11:38:55.817030] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f53d10 00:28:11.963 [2024-06-10 11:38:55.817134] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f50ca0 00:28:11.963 [2024-06-10 11:38:55.817141] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f50ca0 00:28:11.963 [2024-06-10 11:38:55.817214] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.963 11:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.223 11:38:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:12.223 "name": "raid_bdev1", 00:28:12.223 "uuid": "37273b76-2dc5-4712-9642-17cf98a2af1b", 00:28:12.223 "strip_size_kb": 64, 00:28:12.223 "state": "online", 00:28:12.223 "raid_level": "raid0", 00:28:12.223 "superblock": true, 00:28:12.223 "num_base_bdevs": 2, 00:28:12.223 "num_base_bdevs_discovered": 2, 00:28:12.223 "num_base_bdevs_operational": 2, 00:28:12.223 "base_bdevs_list": [ 00:28:12.223 { 00:28:12.223 "name": "BaseBdev1", 00:28:12.223 "uuid": "d8836fcc-0f54-5b2c-be8f-b50223dd6dee", 00:28:12.223 "is_configured": true, 00:28:12.223 "data_offset": 2048, 00:28:12.223 "data_size": 63488 00:28:12.223 }, 00:28:12.223 { 00:28:12.223 "name": "BaseBdev2", 00:28:12.223 "uuid": "601a7e5a-b8a5-5918-a357-f09e22a5a502", 00:28:12.223 "is_configured": true, 00:28:12.223 "data_offset": 2048, 00:28:12.223 "data_size": 63488 00:28:12.223 } 00:28:12.223 ] 00:28:12.223 }' 00:28:12.223 11:38:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:12.223 11:38:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:12.817 11:38:56 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@824 -- # sleep 1 00:28:12.817 11:38:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:12.817 [2024-06-10 11:38:56.597995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f4e420 00:28:13.754 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:28:14.013 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:28:14.013 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:28:14.013 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:28:14.013 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:28:14.013 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:14.013 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.014 "name": "raid_bdev1", 00:28:14.014 "uuid": "37273b76-2dc5-4712-9642-17cf98a2af1b", 00:28:14.014 "strip_size_kb": 64, 00:28:14.014 "state": "online", 00:28:14.014 "raid_level": "raid0", 00:28:14.014 "superblock": true, 00:28:14.014 "num_base_bdevs": 2, 00:28:14.014 "num_base_bdevs_discovered": 2, 00:28:14.014 "num_base_bdevs_operational": 2, 00:28:14.014 "base_bdevs_list": [ 00:28:14.014 { 00:28:14.014 "name": "BaseBdev1", 00:28:14.014 "uuid": "d8836fcc-0f54-5b2c-be8f-b50223dd6dee", 00:28:14.014 "is_configured": true, 00:28:14.014 "data_offset": 2048, 00:28:14.014 "data_size": 63488 00:28:14.014 }, 00:28:14.014 { 00:28:14.014 "name": "BaseBdev2", 00:28:14.014 "uuid": "601a7e5a-b8a5-5918-a357-f09e22a5a502", 00:28:14.014 "is_configured": true, 00:28:14.014 "data_offset": 2048, 00:28:14.014 "data_size": 63488 00:28:14.014 } 00:28:14.014 ] 00:28:14.014 }' 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.014 11:38:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:14.582 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:14.842 [2024-06-10 11:38:58.531098] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.842 [2024-06-10 11:38:58.531126] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 
00:28:14.842 [2024-06-10 11:38:58.533226] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:14.842 [2024-06-10 11:38:58.533249] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:14.842 [2024-06-10 11:38:58.533267] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:14.842 [2024-06-10 11:38:58.533275] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f50ca0 name raid_bdev1, state offline 00:28:14.842 0 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 133697 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 133697 ']' 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 133697 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 133697 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 133697' 00:28:14.842 killing process with pid 133697 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 133697 00:28:14.842 [2024-06-10 11:38:58.597276] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:14.842 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 133697 00:28:14.842 [2024-06-10 11:38:58.607189] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: 
raid_bdev_exit 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.uiV32z4KHG 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:28:15.102 00:28:15.102 real 0m5.053s 00:28:15.102 user 0m7.598s 00:28:15.102 sys 0m0.888s 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:15.102 11:38:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:15.102 ************************************ 00:28:15.102 END TEST raid_read_error_test 00:28:15.102 ************************************ 00:28:15.102 11:38:58 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:28:15.102 11:38:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:15.102 11:38:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:15.102 11:38:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:15.102 ************************************ 00:28:15.102 START TEST raid_write_error_test 00:28:15.102 ************************************ 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 write 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # 
local raid_level=raid0 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # 
strip_size=64 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.H57x85iGAl 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=134499 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 134499 /var/tmp/spdk-raid.sock 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 134499 ']' 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:15.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:15.102 11:38:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:15.102 [2024-06-10 11:38:58.973946] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:28:15.102 [2024-06-10 11:38:58.973999] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134499 ] 00:28:15.362 [2024-06-10 11:38:59.057471] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:15.362 [2024-06-10 11:38:59.135639] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:15.362 [2024-06-10 11:38:59.194582] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:15.362 [2024-06-10 11:38:59.194614] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:15.930 11:38:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:15.930 11:38:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:28:15.930 11:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:15.930 11:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:16.188 BaseBdev1_malloc 00:28:16.188 11:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:28:16.188 true 00:28:16.188 11:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:28:16.446 [2024-06-10 11:39:00.290183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:28:16.446 [2024-06-10 11:39:00.290227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:16.446 
[2024-06-10 11:39:00.290243] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16ffb10 00:28:16.446 [2024-06-10 11:39:00.290252] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:16.446 [2024-06-10 11:39:00.291633] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:16.446 [2024-06-10 11:39:00.291657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:16.446 BaseBdev1 00:28:16.446 11:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:16.446 11:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:16.704 BaseBdev2_malloc 00:28:16.704 11:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:28:16.704 true 00:28:16.963 11:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:28:16.963 [2024-06-10 11:39:00.811310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:28:16.963 [2024-06-10 11:39:00.811349] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:16.963 [2024-06-10 11:39:00.811380] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1704280 00:28:16.963 [2024-06-10 11:39:00.811389] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:16.963 [2024-06-10 11:39:00.812453] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:16.963 [2024-06-10 11:39:00.812475] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:16.963 BaseBdev2 00:28:16.963 11:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:28:17.223 [2024-06-10 11:39:00.991800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:17.223 [2024-06-10 11:39:00.992600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:17.223 [2024-06-10 11:39:00.992726] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1703ca0 00:28:17.223 [2024-06-10 11:39:00.992735] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:17.223 [2024-06-10 11:39:00.992861] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1706d10 00:28:17.223 [2024-06-10 11:39:00.992965] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1703ca0 00:28:17.223 [2024-06-10 11:39:00.992971] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1703ca0 00:28:17.223 [2024-06-10 11:39:00.993036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.223 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.482 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:17.482 "name": "raid_bdev1", 00:28:17.482 "uuid": "ba3d48ab-6331-4ef5-848b-3a2aaf7c6b19", 00:28:17.482 "strip_size_kb": 64, 00:28:17.482 "state": "online", 00:28:17.482 "raid_level": "raid0", 00:28:17.482 "superblock": true, 00:28:17.482 "num_base_bdevs": 2, 00:28:17.482 "num_base_bdevs_discovered": 2, 00:28:17.482 "num_base_bdevs_operational": 2, 00:28:17.482 "base_bdevs_list": [ 00:28:17.482 { 00:28:17.482 "name": "BaseBdev1", 00:28:17.482 "uuid": "9a91ca6f-1cb9-5aa4-b770-a5a5038cab75", 00:28:17.482 "is_configured": true, 00:28:17.482 "data_offset": 2048, 00:28:17.482 "data_size": 63488 00:28:17.482 }, 00:28:17.482 { 00:28:17.482 "name": "BaseBdev2", 00:28:17.482 "uuid": "5ccc7075-52b5-55e7-add1-a72e8fdcf85e", 00:28:17.482 "is_configured": true, 00:28:17.482 "data_offset": 2048, 00:28:17.482 "data_size": 63488 00:28:17.482 } 00:28:17.482 ] 00:28:17.482 }' 00:28:17.482 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:17.482 11:39:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:18.050 
11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:18.050 11:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:28:18.050 [2024-06-10 11:39:01.757983] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1701420 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.991 11:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.251 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.251 "name": "raid_bdev1", 00:28:19.251 "uuid": "ba3d48ab-6331-4ef5-848b-3a2aaf7c6b19", 00:28:19.251 "strip_size_kb": 64, 00:28:19.251 "state": "online", 00:28:19.251 "raid_level": "raid0", 00:28:19.251 "superblock": true, 00:28:19.251 "num_base_bdevs": 2, 00:28:19.251 "num_base_bdevs_discovered": 2, 00:28:19.251 "num_base_bdevs_operational": 2, 00:28:19.251 "base_bdevs_list": [ 00:28:19.251 { 00:28:19.251 "name": "BaseBdev1", 00:28:19.251 "uuid": "9a91ca6f-1cb9-5aa4-b770-a5a5038cab75", 00:28:19.251 "is_configured": true, 00:28:19.251 "data_offset": 2048, 00:28:19.251 "data_size": 63488 00:28:19.251 }, 00:28:19.251 { 00:28:19.251 "name": "BaseBdev2", 00:28:19.251 "uuid": "5ccc7075-52b5-55e7-add1-a72e8fdcf85e", 00:28:19.251 "is_configured": true, 00:28:19.251 "data_offset": 2048, 00:28:19.251 "data_size": 63488 00:28:19.251 } 00:28:19.251 ] 00:28:19.251 }' 00:28:19.251 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.251 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:19.818 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:19.818 [2024-06-10 11:39:03.707139] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:19.818 [2024-06-10 11:39:03.707181] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:28:19.819 [2024-06-10 11:39:03.709346] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:19.819 [2024-06-10 11:39:03.709372] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:19.819 [2024-06-10 11:39:03.709389] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:19.819 [2024-06-10 11:39:03.709397] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1703ca0 name raid_bdev1, state offline 00:28:19.819 0 00:28:19.819 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 134499 00:28:19.819 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 134499 ']' 00:28:19.819 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 134499 00:28:19.819 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:28:19.819 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:19.819 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 134499 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 134499' 00:28:20.077 killing process with pid 134499 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 134499 00:28:20.077 [2024-06-10 11:39:03.773656] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 134499 00:28:20.077 
[2024-06-10 11:39:03.783576] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.H57x85iGAl 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:28:20.077 00:28:20.077 real 0m5.093s 00:28:20.077 user 0m7.627s 00:28:20.077 sys 0m0.917s 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:20.077 11:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:20.077 ************************************ 00:28:20.077 END TEST raid_write_error_test 00:28:20.077 ************************************ 00:28:20.337 11:39:04 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:28:20.337 11:39:04 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:28:20.337 11:39:04 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:20.337 11:39:04 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:20.337 11:39:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:20.337 ************************************ 00:28:20.337 START TEST raid_state_function_test 00:28:20.337 ************************************ 00:28:20.337 
11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 false 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:20.337 11:39:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=135403 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 135403' 00:28:20.337 Process raid pid: 135403 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 135403 /var/tmp/spdk-raid.sock 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 135403 ']' 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:20.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:20.337 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:20.337 [2024-06-10 11:39:04.134764] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:28:20.337 [2024-06-10 11:39:04.134823] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:20.337 [2024-06-10 11:39:04.242229] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:20.595 [2024-06-10 11:39:04.334405] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:20.595 [2024-06-10 11:39:04.385707] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:20.595 [2024-06-10 11:39:04.385730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:21.163 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:21.163 11:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:28:21.163 11:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:21.163 [2024-06-10 11:39:05.081797] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:21.163 [2024-06-10 11:39:05.081832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:21.163 [2024-06-10 11:39:05.081839] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:21.163 [2024-06-10 11:39:05.081863] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.163 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:21.422 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:21.422 "name": "Existed_Raid", 00:28:21.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.422 "strip_size_kb": 64, 00:28:21.422 "state": "configuring", 00:28:21.422 "raid_level": "concat", 00:28:21.422 "superblock": false, 00:28:21.422 "num_base_bdevs": 2, 00:28:21.422 "num_base_bdevs_discovered": 0, 00:28:21.422 "num_base_bdevs_operational": 2, 00:28:21.422 
"base_bdevs_list": [ 00:28:21.422 { 00:28:21.422 "name": "BaseBdev1", 00:28:21.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.422 "is_configured": false, 00:28:21.422 "data_offset": 0, 00:28:21.422 "data_size": 0 00:28:21.422 }, 00:28:21.422 { 00:28:21.422 "name": "BaseBdev2", 00:28:21.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.422 "is_configured": false, 00:28:21.423 "data_offset": 0, 00:28:21.423 "data_size": 0 00:28:21.423 } 00:28:21.423 ] 00:28:21.423 }' 00:28:21.423 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:21.423 11:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:21.990 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:22.250 [2024-06-10 11:39:05.947963] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:22.250 [2024-06-10 11:39:05.947985] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1674510 name Existed_Raid, state configuring 00:28:22.250 11:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:22.250 [2024-06-10 11:39:06.128444] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:22.250 [2024-06-10 11:39:06.128468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:22.250 [2024-06-10 11:39:06.128474] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:22.250 [2024-06-10 11:39:06.128481] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:22.250 11:39:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:28:22.509 [2024-06-10 11:39:06.305490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:22.509 BaseBdev1 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:22.509 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:22.768 [ 00:28:22.768 { 00:28:22.768 "name": "BaseBdev1", 00:28:22.768 "aliases": [ 00:28:22.768 "6943c0f0-ddf9-4102-960f-6f9aa62f1d48" 00:28:22.768 ], 00:28:22.768 "product_name": "Malloc disk", 00:28:22.768 "block_size": 512, 00:28:22.768 "num_blocks": 65536, 00:28:22.768 "uuid": "6943c0f0-ddf9-4102-960f-6f9aa62f1d48", 00:28:22.768 "assigned_rate_limits": { 00:28:22.768 "rw_ios_per_sec": 0, 00:28:22.768 "rw_mbytes_per_sec": 0, 00:28:22.768 "r_mbytes_per_sec": 0, 00:28:22.768 "w_mbytes_per_sec": 0 00:28:22.768 }, 00:28:22.768 "claimed": true, 
00:28:22.768 "claim_type": "exclusive_write", 00:28:22.768 "zoned": false, 00:28:22.768 "supported_io_types": { 00:28:22.768 "read": true, 00:28:22.768 "write": true, 00:28:22.768 "unmap": true, 00:28:22.768 "write_zeroes": true, 00:28:22.768 "flush": true, 00:28:22.768 "reset": true, 00:28:22.768 "compare": false, 00:28:22.768 "compare_and_write": false, 00:28:22.768 "abort": true, 00:28:22.768 "nvme_admin": false, 00:28:22.768 "nvme_io": false 00:28:22.768 }, 00:28:22.768 "memory_domains": [ 00:28:22.768 { 00:28:22.768 "dma_device_id": "system", 00:28:22.768 "dma_device_type": 1 00:28:22.768 }, 00:28:22.768 { 00:28:22.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.768 "dma_device_type": 2 00:28:22.768 } 00:28:22.768 ], 00:28:22.768 "driver_specific": {} 00:28:22.768 } 00:28:22.768 ] 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:22.768 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:22.769 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:22.769 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:22.769 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:22.769 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:22.769 11:39:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:22.769 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.769 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:23.027 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.027 "name": "Existed_Raid", 00:28:23.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.027 "strip_size_kb": 64, 00:28:23.027 "state": "configuring", 00:28:23.027 "raid_level": "concat", 00:28:23.027 "superblock": false, 00:28:23.028 "num_base_bdevs": 2, 00:28:23.028 "num_base_bdevs_discovered": 1, 00:28:23.028 "num_base_bdevs_operational": 2, 00:28:23.028 "base_bdevs_list": [ 00:28:23.028 { 00:28:23.028 "name": "BaseBdev1", 00:28:23.028 "uuid": "6943c0f0-ddf9-4102-960f-6f9aa62f1d48", 00:28:23.028 "is_configured": true, 00:28:23.028 "data_offset": 0, 00:28:23.028 "data_size": 65536 00:28:23.028 }, 00:28:23.028 { 00:28:23.028 "name": "BaseBdev2", 00:28:23.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.028 "is_configured": false, 00:28:23.028 "data_offset": 0, 00:28:23.028 "data_size": 0 00:28:23.028 } 00:28:23.028 ] 00:28:23.028 }' 00:28:23.028 11:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.028 11:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:23.594 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:23.594 [2024-06-10 11:39:07.476656] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:23.595 [2024-06-10 11:39:07.476687] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1673e00 name Existed_Raid, state configuring 00:28:23.595 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:23.853 [2024-06-10 11:39:07.649118] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:23.853 [2024-06-10 11:39:07.650123] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:23.853 [2024-06-10 11:39:07.650149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.853 
11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:23.853 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.112 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:24.112 "name": "Existed_Raid", 00:28:24.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.112 "strip_size_kb": 64, 00:28:24.112 "state": "configuring", 00:28:24.112 "raid_level": "concat", 00:28:24.112 "superblock": false, 00:28:24.112 "num_base_bdevs": 2, 00:28:24.112 "num_base_bdevs_discovered": 1, 00:28:24.112 "num_base_bdevs_operational": 2, 00:28:24.112 "base_bdevs_list": [ 00:28:24.112 { 00:28:24.112 "name": "BaseBdev1", 00:28:24.112 "uuid": "6943c0f0-ddf9-4102-960f-6f9aa62f1d48", 00:28:24.112 "is_configured": true, 00:28:24.112 "data_offset": 0, 00:28:24.112 "data_size": 65536 00:28:24.112 }, 00:28:24.112 { 00:28:24.112 "name": "BaseBdev2", 00:28:24.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.112 "is_configured": false, 00:28:24.112 "data_offset": 0, 00:28:24.112 "data_size": 0 00:28:24.112 } 00:28:24.112 ] 00:28:24.112 }' 00:28:24.112 11:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:24.112 11:39:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:24.680 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:28:24.681 [2024-06-10 11:39:08.494061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:24.681 [2024-06-10 11:39:08.494089] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1674bf0 00:28:24.681 [2024-06-10 11:39:08.494095] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:28:24.681 [2024-06-10 11:39:08.494258] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18269b0 00:28:24.681 [2024-06-10 11:39:08.494335] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1674bf0 00:28:24.681 [2024-06-10 11:39:08.494342] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1674bf0 00:28:24.681 [2024-06-10 11:39:08.494454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:24.681 BaseBdev2 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:24.681 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:24.938 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:24.938 [ 00:28:24.938 { 00:28:24.938 "name": "BaseBdev2", 00:28:24.938 "aliases": [ 00:28:24.938 "945746ee-d335-4a40-a8d1-505ff4698c3d" 00:28:24.938 ], 
00:28:24.938 "product_name": "Malloc disk", 00:28:24.938 "block_size": 512, 00:28:24.939 "num_blocks": 65536, 00:28:24.939 "uuid": "945746ee-d335-4a40-a8d1-505ff4698c3d", 00:28:24.939 "assigned_rate_limits": { 00:28:24.939 "rw_ios_per_sec": 0, 00:28:24.939 "rw_mbytes_per_sec": 0, 00:28:24.939 "r_mbytes_per_sec": 0, 00:28:24.939 "w_mbytes_per_sec": 0 00:28:24.939 }, 00:28:24.939 "claimed": true, 00:28:24.939 "claim_type": "exclusive_write", 00:28:24.939 "zoned": false, 00:28:24.939 "supported_io_types": { 00:28:24.939 "read": true, 00:28:24.939 "write": true, 00:28:24.939 "unmap": true, 00:28:24.939 "write_zeroes": true, 00:28:24.939 "flush": true, 00:28:24.939 "reset": true, 00:28:24.939 "compare": false, 00:28:24.939 "compare_and_write": false, 00:28:24.939 "abort": true, 00:28:24.939 "nvme_admin": false, 00:28:24.939 "nvme_io": false 00:28:24.939 }, 00:28:24.939 "memory_domains": [ 00:28:24.939 { 00:28:24.939 "dma_device_id": "system", 00:28:24.939 "dma_device_type": 1 00:28:24.939 }, 00:28:24.939 { 00:28:24.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:24.939 "dma_device_type": 2 00:28:24.939 } 00:28:24.939 ], 00:28:24.939 "driver_specific": {} 00:28:24.939 } 00:28:24.939 ] 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:24.939 11:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.197 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.197 "name": "Existed_Raid", 00:28:25.197 "uuid": "307b0e04-c7e6-4f51-827d-18e3817d4d51", 00:28:25.197 "strip_size_kb": 64, 00:28:25.197 "state": "online", 00:28:25.197 "raid_level": "concat", 00:28:25.197 "superblock": false, 00:28:25.197 "num_base_bdevs": 2, 00:28:25.197 "num_base_bdevs_discovered": 2, 00:28:25.197 "num_base_bdevs_operational": 2, 00:28:25.197 "base_bdevs_list": [ 00:28:25.197 { 00:28:25.197 "name": "BaseBdev1", 00:28:25.197 "uuid": "6943c0f0-ddf9-4102-960f-6f9aa62f1d48", 00:28:25.197 "is_configured": true, 00:28:25.197 "data_offset": 0, 00:28:25.197 "data_size": 65536 00:28:25.197 }, 00:28:25.197 { 00:28:25.197 "name": "BaseBdev2", 00:28:25.197 "uuid": "945746ee-d335-4a40-a8d1-505ff4698c3d", 00:28:25.197 "is_configured": true, 00:28:25.197 "data_offset": 0, 00:28:25.197 "data_size": 65536 00:28:25.197 } 00:28:25.197 ] 00:28:25.197 }' 
00:28:25.197 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.197 11:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:25.764 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:25.764 [2024-06-10 11:39:09.701345] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:26.023 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:26.023 "name": "Existed_Raid", 00:28:26.023 "aliases": [ 00:28:26.023 "307b0e04-c7e6-4f51-827d-18e3817d4d51" 00:28:26.023 ], 00:28:26.023 "product_name": "Raid Volume", 00:28:26.023 "block_size": 512, 00:28:26.023 "num_blocks": 131072, 00:28:26.023 "uuid": "307b0e04-c7e6-4f51-827d-18e3817d4d51", 00:28:26.023 "assigned_rate_limits": { 00:28:26.023 "rw_ios_per_sec": 0, 00:28:26.023 "rw_mbytes_per_sec": 0, 00:28:26.023 "r_mbytes_per_sec": 0, 00:28:26.023 "w_mbytes_per_sec": 0 00:28:26.023 }, 00:28:26.023 "claimed": false, 00:28:26.023 "zoned": false, 00:28:26.023 "supported_io_types": 
{ 00:28:26.023 "read": true, 00:28:26.023 "write": true, 00:28:26.023 "unmap": true, 00:28:26.023 "write_zeroes": true, 00:28:26.023 "flush": true, 00:28:26.023 "reset": true, 00:28:26.023 "compare": false, 00:28:26.023 "compare_and_write": false, 00:28:26.023 "abort": false, 00:28:26.023 "nvme_admin": false, 00:28:26.023 "nvme_io": false 00:28:26.023 }, 00:28:26.023 "memory_domains": [ 00:28:26.023 { 00:28:26.023 "dma_device_id": "system", 00:28:26.023 "dma_device_type": 1 00:28:26.023 }, 00:28:26.023 { 00:28:26.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:26.023 "dma_device_type": 2 00:28:26.023 }, 00:28:26.023 { 00:28:26.023 "dma_device_id": "system", 00:28:26.023 "dma_device_type": 1 00:28:26.023 }, 00:28:26.023 { 00:28:26.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:26.023 "dma_device_type": 2 00:28:26.023 } 00:28:26.023 ], 00:28:26.023 "driver_specific": { 00:28:26.023 "raid": { 00:28:26.023 "uuid": "307b0e04-c7e6-4f51-827d-18e3817d4d51", 00:28:26.023 "strip_size_kb": 64, 00:28:26.023 "state": "online", 00:28:26.023 "raid_level": "concat", 00:28:26.023 "superblock": false, 00:28:26.023 "num_base_bdevs": 2, 00:28:26.023 "num_base_bdevs_discovered": 2, 00:28:26.023 "num_base_bdevs_operational": 2, 00:28:26.023 "base_bdevs_list": [ 00:28:26.023 { 00:28:26.023 "name": "BaseBdev1", 00:28:26.023 "uuid": "6943c0f0-ddf9-4102-960f-6f9aa62f1d48", 00:28:26.023 "is_configured": true, 00:28:26.023 "data_offset": 0, 00:28:26.023 "data_size": 65536 00:28:26.023 }, 00:28:26.023 { 00:28:26.023 "name": "BaseBdev2", 00:28:26.023 "uuid": "945746ee-d335-4a40-a8d1-505ff4698c3d", 00:28:26.023 "is_configured": true, 00:28:26.023 "data_offset": 0, 00:28:26.023 "data_size": 65536 00:28:26.023 } 00:28:26.023 ] 00:28:26.023 } 00:28:26.023 } 00:28:26.023 }' 00:28:26.023 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:26.023 11:39:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:26.023 BaseBdev2' 00:28:26.023 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:26.023 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:26.023 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:26.023 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:26.023 "name": "BaseBdev1", 00:28:26.023 "aliases": [ 00:28:26.023 "6943c0f0-ddf9-4102-960f-6f9aa62f1d48" 00:28:26.023 ], 00:28:26.023 "product_name": "Malloc disk", 00:28:26.023 "block_size": 512, 00:28:26.023 "num_blocks": 65536, 00:28:26.023 "uuid": "6943c0f0-ddf9-4102-960f-6f9aa62f1d48", 00:28:26.023 "assigned_rate_limits": { 00:28:26.023 "rw_ios_per_sec": 0, 00:28:26.023 "rw_mbytes_per_sec": 0, 00:28:26.023 "r_mbytes_per_sec": 0, 00:28:26.023 "w_mbytes_per_sec": 0 00:28:26.023 }, 00:28:26.023 "claimed": true, 00:28:26.023 "claim_type": "exclusive_write", 00:28:26.023 "zoned": false, 00:28:26.023 "supported_io_types": { 00:28:26.023 "read": true, 00:28:26.023 "write": true, 00:28:26.023 "unmap": true, 00:28:26.023 "write_zeroes": true, 00:28:26.023 "flush": true, 00:28:26.023 "reset": true, 00:28:26.023 "compare": false, 00:28:26.023 "compare_and_write": false, 00:28:26.023 "abort": true, 00:28:26.023 "nvme_admin": false, 00:28:26.023 "nvme_io": false 00:28:26.023 }, 00:28:26.023 "memory_domains": [ 00:28:26.023 { 00:28:26.023 "dma_device_id": "system", 00:28:26.023 "dma_device_type": 1 00:28:26.023 }, 00:28:26.023 { 00:28:26.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:26.023 "dma_device_type": 2 00:28:26.023 } 00:28:26.023 ], 00:28:26.023 "driver_specific": {} 00:28:26.023 }' 00:28:26.023 11:39:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:26.282 11:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.282 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:26.541 "name": "BaseBdev2", 00:28:26.541 "aliases": [ 00:28:26.541 "945746ee-d335-4a40-a8d1-505ff4698c3d" 00:28:26.541 ], 00:28:26.541 "product_name": "Malloc disk", 00:28:26.541 "block_size": 512, 00:28:26.541 "num_blocks": 65536, 00:28:26.541 "uuid": 
"945746ee-d335-4a40-a8d1-505ff4698c3d", 00:28:26.541 "assigned_rate_limits": { 00:28:26.541 "rw_ios_per_sec": 0, 00:28:26.541 "rw_mbytes_per_sec": 0, 00:28:26.541 "r_mbytes_per_sec": 0, 00:28:26.541 "w_mbytes_per_sec": 0 00:28:26.541 }, 00:28:26.541 "claimed": true, 00:28:26.541 "claim_type": "exclusive_write", 00:28:26.541 "zoned": false, 00:28:26.541 "supported_io_types": { 00:28:26.541 "read": true, 00:28:26.541 "write": true, 00:28:26.541 "unmap": true, 00:28:26.541 "write_zeroes": true, 00:28:26.541 "flush": true, 00:28:26.541 "reset": true, 00:28:26.541 "compare": false, 00:28:26.541 "compare_and_write": false, 00:28:26.541 "abort": true, 00:28:26.541 "nvme_admin": false, 00:28:26.541 "nvme_io": false 00:28:26.541 }, 00:28:26.541 "memory_domains": [ 00:28:26.541 { 00:28:26.541 "dma_device_id": "system", 00:28:26.541 "dma_device_type": 1 00:28:26.541 }, 00:28:26.541 { 00:28:26.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:26.541 "dma_device_type": 2 00:28:26.541 } 00:28:26.541 ], 00:28:26.541 "driver_specific": {} 00:28:26.541 }' 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:26.541 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:26.800 
11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:26.800 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:27.060 [2024-06-10 11:39:10.872260] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:27.060 [2024-06-10 11:39:10.872282] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:27.060 [2024-06-10 11:39:10.872309] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:27.060 11:39:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.060 11:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:27.319 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:27.319 "name": "Existed_Raid", 00:28:27.319 "uuid": "307b0e04-c7e6-4f51-827d-18e3817d4d51", 00:28:27.319 "strip_size_kb": 64, 00:28:27.319 "state": "offline", 00:28:27.319 "raid_level": "concat", 00:28:27.319 "superblock": false, 00:28:27.319 "num_base_bdevs": 2, 00:28:27.319 "num_base_bdevs_discovered": 1, 00:28:27.319 "num_base_bdevs_operational": 1, 00:28:27.319 "base_bdevs_list": [ 00:28:27.319 { 00:28:27.319 "name": null, 00:28:27.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:27.319 "is_configured": false, 00:28:27.319 "data_offset": 0, 00:28:27.319 "data_size": 65536 00:28:27.319 }, 00:28:27.319 { 00:28:27.319 "name": "BaseBdev2", 00:28:27.319 "uuid": "945746ee-d335-4a40-a8d1-505ff4698c3d", 00:28:27.319 "is_configured": true, 00:28:27.319 "data_offset": 0, 00:28:27.319 "data_size": 65536 00:28:27.319 } 00:28:27.319 ] 00:28:27.319 }' 00:28:27.319 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:27.319 11:39:11 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:28:27.577 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:27.577 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:27.577 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.835 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:27.835 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:27.835 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:27.835 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:28.094 [2024-06-10 11:39:11.840428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:28.094 [2024-06-10 11:39:11.840469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1674bf0 name Existed_Raid, state offline 00:28:28.094 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:28.094 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:28.094 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.094 11:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:28.094 11:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:28.094 11:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:28:28.094 11:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:28.094 11:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 135403 00:28:28.094 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 135403 ']' 00:28:28.094 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 135403 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 135403 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 135403' 00:28:28.353 killing process with pid 135403 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 135403 00:28:28.353 [2024-06-10 11:39:12.088444] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:28.353 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 135403 00:28:28.353 [2024-06-10 11:39:12.089326] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:28:28.612 00:28:28.612 real 0m8.216s 00:28:28.612 user 0m14.451s 00:28:28.612 sys 0m1.555s 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:28:28.612 ************************************ 00:28:28.612 END TEST raid_state_function_test 00:28:28.612 ************************************ 00:28:28.612 11:39:12 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:28:28.612 11:39:12 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:28.612 11:39:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:28.612 11:39:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:28.612 ************************************ 00:28:28.612 START TEST raid_state_function_test_sb 00:28:28.612 ************************************ 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 true 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:28.612 11:39:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:28.612 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=137106 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 137106' 00:28:28.613 Process raid pid: 137106 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@246 -- # waitforlisten 137106 /var/tmp/spdk-raid.sock 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 137106 ']' 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:28.613 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:28.613 11:39:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:28.613 [2024-06-10 11:39:12.449226] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:28:28.613 [2024-06-10 11:39:12.449281] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:28.613 [2024-06-10 11:39:12.534783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:28.872 [2024-06-10 11:39:12.620528] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:28.872 [2024-06-10 11:39:12.673577] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:28.872 [2024-06-10 11:39:12.673604] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:29.439 11:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:29.439 11:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:28:29.439 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:29.733 [2024-06-10 11:39:13.407294] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:29.733 [2024-06-10 11:39:13.407331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:29.733 [2024-06-10 11:39:13.407340] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:29.733 [2024-06-10 11:39:13.407348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:29.733 11:39:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:29.733 "name": "Existed_Raid", 00:28:29.733 "uuid": "d602f2a0-8857-411f-b49c-54641a496af5", 00:28:29.733 "strip_size_kb": 64, 00:28:29.733 "state": "configuring", 00:28:29.733 "raid_level": "concat", 00:28:29.733 "superblock": true, 00:28:29.733 "num_base_bdevs": 2, 00:28:29.733 "num_base_bdevs_discovered": 0, 00:28:29.733 "num_base_bdevs_operational": 2, 00:28:29.733 "base_bdevs_list": [ 00:28:29.733 { 00:28:29.733 "name": "BaseBdev1", 00:28:29.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:29.733 "is_configured": false, 00:28:29.733 "data_offset": 0, 00:28:29.733 "data_size": 0 00:28:29.733 }, 00:28:29.733 { 00:28:29.733 "name": 
"BaseBdev2", 00:28:29.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:29.733 "is_configured": false, 00:28:29.733 "data_offset": 0, 00:28:29.733 "data_size": 0 00:28:29.733 } 00:28:29.733 ] 00:28:29.733 }' 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:29.733 11:39:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:30.341 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:30.341 [2024-06-10 11:39:14.245351] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:30.341 [2024-06-10 11:39:14.245372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1245510 name Existed_Raid, state configuring 00:28:30.341 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:30.600 [2024-06-10 11:39:14.417830] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:30.600 [2024-06-10 11:39:14.417857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:30.600 [2024-06-10 11:39:14.417863] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:30.600 [2024-06-10 11:39:14.417875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:30.600 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:28:30.858 [2024-06-10 11:39:14.602824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:28:30.858 BaseBdev1 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:30.859 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:31.118 [ 00:28:31.118 { 00:28:31.118 "name": "BaseBdev1", 00:28:31.118 "aliases": [ 00:28:31.118 "d6f45b60-242f-417a-b5ec-44569052db41" 00:28:31.118 ], 00:28:31.118 "product_name": "Malloc disk", 00:28:31.118 "block_size": 512, 00:28:31.118 "num_blocks": 65536, 00:28:31.118 "uuid": "d6f45b60-242f-417a-b5ec-44569052db41", 00:28:31.118 "assigned_rate_limits": { 00:28:31.118 "rw_ios_per_sec": 0, 00:28:31.118 "rw_mbytes_per_sec": 0, 00:28:31.118 "r_mbytes_per_sec": 0, 00:28:31.118 "w_mbytes_per_sec": 0 00:28:31.118 }, 00:28:31.118 "claimed": true, 00:28:31.118 "claim_type": "exclusive_write", 00:28:31.118 "zoned": false, 00:28:31.118 "supported_io_types": { 00:28:31.118 "read": true, 00:28:31.118 "write": true, 00:28:31.118 "unmap": true, 00:28:31.118 "write_zeroes": true, 00:28:31.118 "flush": true, 
00:28:31.118 "reset": true, 00:28:31.118 "compare": false, 00:28:31.118 "compare_and_write": false, 00:28:31.118 "abort": true, 00:28:31.118 "nvme_admin": false, 00:28:31.118 "nvme_io": false 00:28:31.118 }, 00:28:31.118 "memory_domains": [ 00:28:31.118 { 00:28:31.118 "dma_device_id": "system", 00:28:31.118 "dma_device_type": 1 00:28:31.118 }, 00:28:31.118 { 00:28:31.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:31.118 "dma_device_type": 2 00:28:31.118 } 00:28:31.118 ], 00:28:31.118 "driver_specific": {} 00:28:31.118 } 00:28:31.118 ] 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.118 11:39:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:31.378 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.378 "name": "Existed_Raid", 00:28:31.378 "uuid": "deb3e774-fdf8-4841-8542-27bf567c79f6", 00:28:31.378 "strip_size_kb": 64, 00:28:31.378 "state": "configuring", 00:28:31.378 "raid_level": "concat", 00:28:31.378 "superblock": true, 00:28:31.378 "num_base_bdevs": 2, 00:28:31.378 "num_base_bdevs_discovered": 1, 00:28:31.378 "num_base_bdevs_operational": 2, 00:28:31.378 "base_bdevs_list": [ 00:28:31.378 { 00:28:31.378 "name": "BaseBdev1", 00:28:31.378 "uuid": "d6f45b60-242f-417a-b5ec-44569052db41", 00:28:31.378 "is_configured": true, 00:28:31.378 "data_offset": 2048, 00:28:31.378 "data_size": 63488 00:28:31.378 }, 00:28:31.378 { 00:28:31.378 "name": "BaseBdev2", 00:28:31.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.378 "is_configured": false, 00:28:31.378 "data_offset": 0, 00:28:31.378 "data_size": 0 00:28:31.378 } 00:28:31.378 ] 00:28:31.378 }' 00:28:31.378 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.378 11:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:31.946 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:31.946 [2024-06-10 11:39:15.785884] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:31.946 [2024-06-10 11:39:15.785918] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1244e00 name Existed_Raid, state configuring 00:28:31.946 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:32.206 [2024-06-10 11:39:15.966383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:32.206 [2024-06-10 11:39:15.967441] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:32.206 [2024-06-10 11:39:15.967468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:32.206 11:39:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.206 11:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:32.465 11:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:32.465 "name": "Existed_Raid", 00:28:32.465 "uuid": "5697af83-2500-4b5c-a3ec-b8cfe40cb3b0", 00:28:32.465 "strip_size_kb": 64, 00:28:32.465 "state": "configuring", 00:28:32.465 "raid_level": "concat", 00:28:32.465 "superblock": true, 00:28:32.465 "num_base_bdevs": 2, 00:28:32.465 "num_base_bdevs_discovered": 1, 00:28:32.465 "num_base_bdevs_operational": 2, 00:28:32.465 "base_bdevs_list": [ 00:28:32.465 { 00:28:32.465 "name": "BaseBdev1", 00:28:32.465 "uuid": "d6f45b60-242f-417a-b5ec-44569052db41", 00:28:32.465 "is_configured": true, 00:28:32.465 "data_offset": 2048, 00:28:32.465 "data_size": 63488 00:28:32.465 }, 00:28:32.465 { 00:28:32.465 "name": "BaseBdev2", 00:28:32.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:32.465 "is_configured": false, 00:28:32.465 "data_offset": 0, 00:28:32.465 "data_size": 0 00:28:32.465 } 00:28:32.465 ] 00:28:32.465 }' 00:28:32.465 11:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:32.465 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:28:33.033 [2024-06-10 11:39:16.835545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:33.033 [2024-06-10 11:39:16.835664] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1245bf0 00:28:33.033 
[2024-06-10 11:39:16.835673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:33.033 [2024-06-10 11:39:16.835789] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f79b0 00:28:33.033 [2024-06-10 11:39:16.835877] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1245bf0 00:28:33.033 [2024-06-10 11:39:16.835884] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1245bf0 00:28:33.033 [2024-06-10 11:39:16.835964] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:33.033 BaseBdev2 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:33.033 11:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:33.292 [ 00:28:33.292 { 00:28:33.292 "name": "BaseBdev2", 00:28:33.292 "aliases": [ 00:28:33.292 "78c2602d-affb-485b-ab05-7f49805b2067" 00:28:33.292 ], 00:28:33.292 "product_name": "Malloc disk", 00:28:33.292 "block_size": 512, 
00:28:33.292 "num_blocks": 65536, 00:28:33.292 "uuid": "78c2602d-affb-485b-ab05-7f49805b2067", 00:28:33.292 "assigned_rate_limits": { 00:28:33.292 "rw_ios_per_sec": 0, 00:28:33.292 "rw_mbytes_per_sec": 0, 00:28:33.292 "r_mbytes_per_sec": 0, 00:28:33.292 "w_mbytes_per_sec": 0 00:28:33.292 }, 00:28:33.292 "claimed": true, 00:28:33.292 "claim_type": "exclusive_write", 00:28:33.292 "zoned": false, 00:28:33.292 "supported_io_types": { 00:28:33.292 "read": true, 00:28:33.292 "write": true, 00:28:33.292 "unmap": true, 00:28:33.292 "write_zeroes": true, 00:28:33.292 "flush": true, 00:28:33.292 "reset": true, 00:28:33.292 "compare": false, 00:28:33.292 "compare_and_write": false, 00:28:33.292 "abort": true, 00:28:33.292 "nvme_admin": false, 00:28:33.292 "nvme_io": false 00:28:33.292 }, 00:28:33.292 "memory_domains": [ 00:28:33.292 { 00:28:33.292 "dma_device_id": "system", 00:28:33.292 "dma_device_type": 1 00:28:33.292 }, 00:28:33.292 { 00:28:33.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:33.292 "dma_device_type": 2 00:28:33.292 } 00:28:33.292 ], 00:28:33.292 "driver_specific": {} 00:28:33.292 } 00:28:33.292 ] 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:33.292 11:39:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.292 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:33.552 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.552 "name": "Existed_Raid", 00:28:33.552 "uuid": "5697af83-2500-4b5c-a3ec-b8cfe40cb3b0", 00:28:33.552 "strip_size_kb": 64, 00:28:33.552 "state": "online", 00:28:33.552 "raid_level": "concat", 00:28:33.552 "superblock": true, 00:28:33.552 "num_base_bdevs": 2, 00:28:33.552 "num_base_bdevs_discovered": 2, 00:28:33.552 "num_base_bdevs_operational": 2, 00:28:33.552 "base_bdevs_list": [ 00:28:33.552 { 00:28:33.552 "name": "BaseBdev1", 00:28:33.552 "uuid": "d6f45b60-242f-417a-b5ec-44569052db41", 00:28:33.552 "is_configured": true, 00:28:33.552 "data_offset": 2048, 00:28:33.552 "data_size": 63488 00:28:33.552 }, 00:28:33.552 { 00:28:33.552 "name": "BaseBdev2", 00:28:33.552 "uuid": "78c2602d-affb-485b-ab05-7f49805b2067", 00:28:33.552 "is_configured": true, 00:28:33.552 "data_offset": 2048, 00:28:33.552 "data_size": 63488 00:28:33.552 } 00:28:33.552 ] 00:28:33.552 }' 00:28:33.552 
11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.552 11:39:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:34.119 11:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:34.119 [2024-06-10 11:39:18.026780] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:34.119 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:34.119 "name": "Existed_Raid", 00:28:34.119 "aliases": [ 00:28:34.119 "5697af83-2500-4b5c-a3ec-b8cfe40cb3b0" 00:28:34.119 ], 00:28:34.119 "product_name": "Raid Volume", 00:28:34.119 "block_size": 512, 00:28:34.119 "num_blocks": 126976, 00:28:34.119 "uuid": "5697af83-2500-4b5c-a3ec-b8cfe40cb3b0", 00:28:34.119 "assigned_rate_limits": { 00:28:34.119 "rw_ios_per_sec": 0, 00:28:34.119 "rw_mbytes_per_sec": 0, 00:28:34.119 "r_mbytes_per_sec": 0, 00:28:34.119 "w_mbytes_per_sec": 0 00:28:34.119 }, 00:28:34.119 "claimed": false, 00:28:34.119 "zoned": false, 00:28:34.119 
"supported_io_types": { 00:28:34.119 "read": true, 00:28:34.119 "write": true, 00:28:34.119 "unmap": true, 00:28:34.119 "write_zeroes": true, 00:28:34.119 "flush": true, 00:28:34.119 "reset": true, 00:28:34.119 "compare": false, 00:28:34.119 "compare_and_write": false, 00:28:34.119 "abort": false, 00:28:34.119 "nvme_admin": false, 00:28:34.119 "nvme_io": false 00:28:34.119 }, 00:28:34.119 "memory_domains": [ 00:28:34.119 { 00:28:34.119 "dma_device_id": "system", 00:28:34.119 "dma_device_type": 1 00:28:34.119 }, 00:28:34.119 { 00:28:34.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.119 "dma_device_type": 2 00:28:34.119 }, 00:28:34.119 { 00:28:34.119 "dma_device_id": "system", 00:28:34.119 "dma_device_type": 1 00:28:34.119 }, 00:28:34.119 { 00:28:34.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.119 "dma_device_type": 2 00:28:34.119 } 00:28:34.119 ], 00:28:34.119 "driver_specific": { 00:28:34.119 "raid": { 00:28:34.119 "uuid": "5697af83-2500-4b5c-a3ec-b8cfe40cb3b0", 00:28:34.119 "strip_size_kb": 64, 00:28:34.119 "state": "online", 00:28:34.119 "raid_level": "concat", 00:28:34.119 "superblock": true, 00:28:34.119 "num_base_bdevs": 2, 00:28:34.119 "num_base_bdevs_discovered": 2, 00:28:34.119 "num_base_bdevs_operational": 2, 00:28:34.119 "base_bdevs_list": [ 00:28:34.119 { 00:28:34.119 "name": "BaseBdev1", 00:28:34.119 "uuid": "d6f45b60-242f-417a-b5ec-44569052db41", 00:28:34.119 "is_configured": true, 00:28:34.120 "data_offset": 2048, 00:28:34.120 "data_size": 63488 00:28:34.120 }, 00:28:34.120 { 00:28:34.120 "name": "BaseBdev2", 00:28:34.120 "uuid": "78c2602d-affb-485b-ab05-7f49805b2067", 00:28:34.120 "is_configured": true, 00:28:34.120 "data_offset": 2048, 00:28:34.120 "data_size": 63488 00:28:34.120 } 00:28:34.120 ] 00:28:34.120 } 00:28:34.120 } 00:28:34.120 }' 00:28:34.120 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:34.379 
11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:34.379 BaseBdev2' 00:28:34.379 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:34.379 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:34.379 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:34.379 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:34.379 "name": "BaseBdev1", 00:28:34.379 "aliases": [ 00:28:34.379 "d6f45b60-242f-417a-b5ec-44569052db41" 00:28:34.379 ], 00:28:34.379 "product_name": "Malloc disk", 00:28:34.379 "block_size": 512, 00:28:34.379 "num_blocks": 65536, 00:28:34.379 "uuid": "d6f45b60-242f-417a-b5ec-44569052db41", 00:28:34.379 "assigned_rate_limits": { 00:28:34.379 "rw_ios_per_sec": 0, 00:28:34.379 "rw_mbytes_per_sec": 0, 00:28:34.379 "r_mbytes_per_sec": 0, 00:28:34.379 "w_mbytes_per_sec": 0 00:28:34.379 }, 00:28:34.379 "claimed": true, 00:28:34.379 "claim_type": "exclusive_write", 00:28:34.379 "zoned": false, 00:28:34.379 "supported_io_types": { 00:28:34.379 "read": true, 00:28:34.379 "write": true, 00:28:34.379 "unmap": true, 00:28:34.379 "write_zeroes": true, 00:28:34.379 "flush": true, 00:28:34.379 "reset": true, 00:28:34.379 "compare": false, 00:28:34.379 "compare_and_write": false, 00:28:34.379 "abort": true, 00:28:34.379 "nvme_admin": false, 00:28:34.379 "nvme_io": false 00:28:34.379 }, 00:28:34.379 "memory_domains": [ 00:28:34.379 { 00:28:34.379 "dma_device_id": "system", 00:28:34.379 "dma_device_type": 1 00:28:34.379 }, 00:28:34.379 { 00:28:34.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.379 "dma_device_type": 2 00:28:34.379 } 00:28:34.379 ], 00:28:34.379 "driver_specific": {} 00:28:34.379 }' 00:28:34.379 11:39:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.379 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.638 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:34.897 "name": "BaseBdev2", 00:28:34.897 "aliases": [ 00:28:34.897 "78c2602d-affb-485b-ab05-7f49805b2067" 00:28:34.897 ], 00:28:34.897 "product_name": "Malloc disk", 00:28:34.897 "block_size": 512, 00:28:34.897 
"num_blocks": 65536, 00:28:34.897 "uuid": "78c2602d-affb-485b-ab05-7f49805b2067", 00:28:34.897 "assigned_rate_limits": { 00:28:34.897 "rw_ios_per_sec": 0, 00:28:34.897 "rw_mbytes_per_sec": 0, 00:28:34.897 "r_mbytes_per_sec": 0, 00:28:34.897 "w_mbytes_per_sec": 0 00:28:34.897 }, 00:28:34.897 "claimed": true, 00:28:34.897 "claim_type": "exclusive_write", 00:28:34.897 "zoned": false, 00:28:34.897 "supported_io_types": { 00:28:34.897 "read": true, 00:28:34.897 "write": true, 00:28:34.897 "unmap": true, 00:28:34.897 "write_zeroes": true, 00:28:34.897 "flush": true, 00:28:34.897 "reset": true, 00:28:34.897 "compare": false, 00:28:34.897 "compare_and_write": false, 00:28:34.897 "abort": true, 00:28:34.897 "nvme_admin": false, 00:28:34.897 "nvme_io": false 00:28:34.897 }, 00:28:34.897 "memory_domains": [ 00:28:34.897 { 00:28:34.897 "dma_device_id": "system", 00:28:34.897 "dma_device_type": 1 00:28:34.897 }, 00:28:34.897 { 00:28:34.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.897 "dma_device_type": 2 00:28:34.897 } 00:28:34.897 ], 00:28:34.897 "driver_specific": {} 00:28:34.897 }' 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.897 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:35.156 11:39:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:35.156 11:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:35.156 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:35.156 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:35.156 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:35.415 [2024-06-10 11:39:19.225757] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:35.415 [2024-06-10 11:39:19.225781] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:35.415 [2024-06-10 11:39:19.225810] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.415 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:35.674 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.674 "name": "Existed_Raid", 00:28:35.674 "uuid": "5697af83-2500-4b5c-a3ec-b8cfe40cb3b0", 00:28:35.674 "strip_size_kb": 64, 00:28:35.674 "state": "offline", 00:28:35.674 "raid_level": "concat", 00:28:35.674 "superblock": true, 00:28:35.674 "num_base_bdevs": 2, 00:28:35.674 "num_base_bdevs_discovered": 1, 00:28:35.674 "num_base_bdevs_operational": 1, 00:28:35.674 "base_bdevs_list": [ 00:28:35.674 { 00:28:35.674 "name": null, 00:28:35.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:35.674 "is_configured": false, 00:28:35.674 "data_offset": 2048, 00:28:35.674 "data_size": 63488 00:28:35.674 }, 00:28:35.674 { 00:28:35.674 "name": "BaseBdev2", 00:28:35.674 "uuid": "78c2602d-affb-485b-ab05-7f49805b2067", 00:28:35.674 "is_configured": true, 00:28:35.674 "data_offset": 2048, 00:28:35.674 "data_size": 63488 00:28:35.674 } 
00:28:35.674 ] 00:28:35.674 }' 00:28:35.674 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.674 11:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:35.933 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:35.933 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:36.192 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:36.192 11:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.192 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:36.192 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:36.192 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:36.451 [2024-06-10 11:39:20.225467] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:36.451 [2024-06-10 11:39:20.225508] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1245bf0 name Existed_Raid, state offline 00:28:36.451 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:36.451 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:36.451 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.451 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r 
'.[0]["name"] | select(.)' 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 137106 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 137106 ']' 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 137106 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 137106 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 137106' 00:28:36.711 killing process with pid 137106 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 137106 00:28:36.711 [2024-06-10 11:39:20.471098] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:36.711 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 137106 00:28:36.711 [2024-06-10 11:39:20.472030] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:36.970 11:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:28:36.970 
00:28:36.970 real 0m8.278s 00:28:36.970 user 0m14.602s 00:28:36.970 sys 0m1.598s 00:28:36.970 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:36.970 11:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:36.970 ************************************ 00:28:36.970 END TEST raid_state_function_test_sb 00:28:36.970 ************************************ 00:28:36.970 11:39:20 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:28:36.970 11:39:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:28:36.970 11:39:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:36.970 11:39:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:36.970 ************************************ 00:28:36.970 START TEST raid_superblock_test 00:28:36.970 ************************************ 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 2 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=138418 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 138418 /var/tmp/spdk-raid.sock 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 138418 ']' 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:36.970 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:36.970 11:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:36.970 [2024-06-10 11:39:20.798039] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:28:36.970 [2024-06-10 11:39:20.798089] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138418 ] 00:28:36.970 [2024-06-10 11:39:20.883707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:37.229 [2024-06-10 11:39:20.971278] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:37.229 [2024-06-10 11:39:21.027735] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:37.229 [2024-06-10 11:39:21.027767] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:37.796 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:28:38.054 malloc1 00:28:38.054 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:38.054 [2024-06-10 11:39:21.924810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:38.054 [2024-06-10 11:39:21.924849] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:38.054 [2024-06-10 11:39:21.924885] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd71100 00:28:38.054 [2024-06-10 11:39:21.924894] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:38.054 [2024-06-10 11:39:21.926135] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:38.054 [2024-06-10 11:39:21.926158] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:38.054 pt1 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:38.055 11:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:28:38.313 malloc2 00:28:38.313 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:38.313 [2024-06-10 11:39:22.249494] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:38.313 [2024-06-10 11:39:22.249530] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:38.313 [2024-06-10 11:39:22.249560] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd72500 00:28:38.313 [2024-06-10 11:39:22.249569] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:38.313 [2024-06-10 11:39:22.250682] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:38.313 [2024-06-10 11:39:22.250705] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:38.313 pt2 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:28:38.572 [2024-06-10 11:39:22.425989] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:38.572 [2024-06-10 11:39:22.426979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:38.572 [2024-06-10 11:39:22.427085] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd71c00 00:28:38.572 [2024-06-10 11:39:22.427093] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:38.572 [2024-06-10 11:39:22.427228] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd70a00 00:28:38.572 [2024-06-10 11:39:22.427330] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd71c00 00:28:38.572 [2024-06-10 11:39:22.427337] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd71c00 00:28:38.572 [2024-06-10 11:39:22.427403] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.572 11:39:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.572 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.831 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.831 "name": "raid_bdev1", 00:28:38.831 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:38.831 "strip_size_kb": 64, 00:28:38.831 "state": "online", 00:28:38.831 "raid_level": "concat", 00:28:38.831 "superblock": true, 00:28:38.831 "num_base_bdevs": 2, 00:28:38.831 "num_base_bdevs_discovered": 2, 00:28:38.831 "num_base_bdevs_operational": 2, 00:28:38.831 "base_bdevs_list": [ 00:28:38.831 { 00:28:38.831 "name": "pt1", 00:28:38.831 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:38.831 "is_configured": true, 00:28:38.831 "data_offset": 2048, 00:28:38.831 "data_size": 63488 00:28:38.831 }, 00:28:38.831 { 00:28:38.831 "name": "pt2", 00:28:38.831 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:38.831 "is_configured": true, 00:28:38.831 "data_offset": 2048, 00:28:38.831 "data_size": 63488 00:28:38.831 } 00:28:38.831 ] 00:28:38.831 }' 00:28:38.831 11:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.831 11:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:39.398 [2024-06-10 11:39:23.284300] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:39.398 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:39.398 "name": "raid_bdev1", 00:28:39.398 "aliases": [ 00:28:39.398 "383569eb-04e9-424e-92a8-404f7ed5602c" 00:28:39.398 ], 00:28:39.398 "product_name": "Raid Volume", 00:28:39.398 "block_size": 512, 00:28:39.398 "num_blocks": 126976, 00:28:39.398 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:39.398 "assigned_rate_limits": { 00:28:39.398 "rw_ios_per_sec": 0, 00:28:39.398 "rw_mbytes_per_sec": 0, 00:28:39.398 "r_mbytes_per_sec": 0, 00:28:39.398 "w_mbytes_per_sec": 0 00:28:39.398 }, 00:28:39.398 "claimed": false, 00:28:39.398 "zoned": false, 00:28:39.398 "supported_io_types": { 00:28:39.398 "read": true, 00:28:39.398 "write": true, 00:28:39.398 "unmap": true, 00:28:39.398 "write_zeroes": true, 00:28:39.398 "flush": true, 00:28:39.398 "reset": true, 00:28:39.398 "compare": false, 00:28:39.398 "compare_and_write": false, 00:28:39.398 "abort": false, 00:28:39.398 "nvme_admin": false, 00:28:39.398 "nvme_io": false 00:28:39.398 }, 00:28:39.398 "memory_domains": [ 00:28:39.398 { 00:28:39.398 "dma_device_id": "system", 00:28:39.398 "dma_device_type": 1 00:28:39.398 }, 00:28:39.398 { 00:28:39.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.398 "dma_device_type": 2 00:28:39.398 }, 00:28:39.398 { 00:28:39.398 "dma_device_id": "system", 
00:28:39.398 "dma_device_type": 1 00:28:39.398 }, 00:28:39.398 { 00:28:39.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.398 "dma_device_type": 2 00:28:39.398 } 00:28:39.398 ], 00:28:39.398 "driver_specific": { 00:28:39.398 "raid": { 00:28:39.398 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:39.398 "strip_size_kb": 64, 00:28:39.398 "state": "online", 00:28:39.398 "raid_level": "concat", 00:28:39.398 "superblock": true, 00:28:39.398 "num_base_bdevs": 2, 00:28:39.398 "num_base_bdevs_discovered": 2, 00:28:39.398 "num_base_bdevs_operational": 2, 00:28:39.398 "base_bdevs_list": [ 00:28:39.398 { 00:28:39.398 "name": "pt1", 00:28:39.398 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.398 "is_configured": true, 00:28:39.398 "data_offset": 2048, 00:28:39.398 "data_size": 63488 00:28:39.398 }, 00:28:39.398 { 00:28:39.398 "name": "pt2", 00:28:39.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:39.399 "is_configured": true, 00:28:39.399 "data_offset": 2048, 00:28:39.399 "data_size": 63488 00:28:39.399 } 00:28:39.399 ] 00:28:39.399 } 00:28:39.399 } 00:28:39.399 }' 00:28:39.399 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:39.657 pt2' 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:39.657 "name": "pt1", 00:28:39.657 "aliases": [ 00:28:39.657 "00000000-0000-0000-0000-000000000001" 
00:28:39.657 ], 00:28:39.657 "product_name": "passthru", 00:28:39.657 "block_size": 512, 00:28:39.657 "num_blocks": 65536, 00:28:39.657 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.657 "assigned_rate_limits": { 00:28:39.657 "rw_ios_per_sec": 0, 00:28:39.657 "rw_mbytes_per_sec": 0, 00:28:39.657 "r_mbytes_per_sec": 0, 00:28:39.657 "w_mbytes_per_sec": 0 00:28:39.657 }, 00:28:39.657 "claimed": true, 00:28:39.657 "claim_type": "exclusive_write", 00:28:39.657 "zoned": false, 00:28:39.657 "supported_io_types": { 00:28:39.657 "read": true, 00:28:39.657 "write": true, 00:28:39.657 "unmap": true, 00:28:39.657 "write_zeroes": true, 00:28:39.657 "flush": true, 00:28:39.657 "reset": true, 00:28:39.657 "compare": false, 00:28:39.657 "compare_and_write": false, 00:28:39.657 "abort": true, 00:28:39.657 "nvme_admin": false, 00:28:39.657 "nvme_io": false 00:28:39.657 }, 00:28:39.657 "memory_domains": [ 00:28:39.657 { 00:28:39.657 "dma_device_id": "system", 00:28:39.657 "dma_device_type": 1 00:28:39.657 }, 00:28:39.657 { 00:28:39.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.657 "dma_device_type": 2 00:28:39.657 } 00:28:39.657 ], 00:28:39.657 "driver_specific": { 00:28:39.657 "passthru": { 00:28:39.657 "name": "pt1", 00:28:39.657 "base_bdev_name": "malloc1" 00:28:39.657 } 00:28:39.657 } 00:28:39.657 }' 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:39.657 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:39.916 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:40.176 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:40.176 "name": "pt2", 00:28:40.176 "aliases": [ 00:28:40.176 "00000000-0000-0000-0000-000000000002" 00:28:40.176 ], 00:28:40.176 "product_name": "passthru", 00:28:40.176 "block_size": 512, 00:28:40.176 "num_blocks": 65536, 00:28:40.176 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.176 "assigned_rate_limits": { 00:28:40.176 "rw_ios_per_sec": 0, 00:28:40.176 "rw_mbytes_per_sec": 0, 00:28:40.176 "r_mbytes_per_sec": 0, 00:28:40.176 "w_mbytes_per_sec": 0 00:28:40.176 }, 00:28:40.176 "claimed": true, 00:28:40.176 "claim_type": "exclusive_write", 00:28:40.176 "zoned": false, 00:28:40.176 "supported_io_types": { 00:28:40.176 "read": true, 00:28:40.176 "write": true, 00:28:40.176 "unmap": true, 00:28:40.176 "write_zeroes": true, 00:28:40.176 "flush": true, 00:28:40.176 "reset": true, 00:28:40.176 "compare": false, 00:28:40.176 "compare_and_write": false, 00:28:40.176 "abort": true, 00:28:40.176 "nvme_admin": false, 00:28:40.176 "nvme_io": false 00:28:40.176 }, 
00:28:40.176 "memory_domains": [ 00:28:40.176 { 00:28:40.176 "dma_device_id": "system", 00:28:40.176 "dma_device_type": 1 00:28:40.176 }, 00:28:40.176 { 00:28:40.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:40.176 "dma_device_type": 2 00:28:40.176 } 00:28:40.176 ], 00:28:40.176 "driver_specific": { 00:28:40.176 "passthru": { 00:28:40.176 "name": "pt2", 00:28:40.176 "base_bdev_name": "malloc2" 00:28:40.176 } 00:28:40.176 } 00:28:40.176 }' 00:28:40.176 11:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:40.176 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:40.176 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:40.176 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:40.176 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:40.435 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:40.694 [2024-06-10 11:39:24.467362] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:40.695 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=383569eb-04e9-424e-92a8-404f7ed5602c 00:28:40.695 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 383569eb-04e9-424e-92a8-404f7ed5602c ']' 00:28:40.695 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:40.954 [2024-06-10 11:39:24.643671] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:40.954 [2024-06-10 11:39:24.643688] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:40.954 [2024-06-10 11:39:24.643726] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:40.954 [2024-06-10 11:39:24.643757] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:40.954 [2024-06-10 11:39:24.643764] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd71c00 name raid_bdev1, state offline 00:28:40.954 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.954 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:40.954 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:40.954 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:40.954 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:40.954 11:39:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:28:41.213 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:41.213 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:41.473 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:28:41.732 [2024-06-10 11:39:25.525939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:41.732 [2024-06-10 11:39:25.526987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:41.732 [2024-06-10 11:39:25.527031] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:41.732 [2024-06-10 11:39:25.527079] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:41.732 [2024-06-10 11:39:25.527092] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:41.732 [2024-06-10 11:39:25.527099] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1f060 name raid_bdev1, state configuring 00:28:41.732 request: 00:28:41.732 { 00:28:41.732 "name": "raid_bdev1", 00:28:41.732 "raid_level": "concat", 00:28:41.732 "base_bdevs": [ 00:28:41.732 "malloc1", 00:28:41.732 "malloc2" 00:28:41.732 ], 00:28:41.732 "superblock": false, 00:28:41.732 "strip_size_kb": 64, 00:28:41.732 "method": "bdev_raid_create", 00:28:41.732 "req_id": 1 00:28:41.732 } 
00:28:41.732 Got JSON-RPC error response 00:28:41.732 response: 00:28:41.732 { 00:28:41.732 "code": -17, 00:28:41.732 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:41.732 } 00:28:41.732 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:28:41.732 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:28:41.732 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:28:41.732 11:39:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:28:41.732 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.732 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:41.992 [2024-06-10 11:39:25.870788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:41.992 [2024-06-10 11:39:25.870826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:41.992 [2024-06-10 11:39:25.870838] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1ecd0 00:28:41.992 [2024-06-10 11:39:25.870846] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:41.992 [2024-06-10 11:39:25.872102] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:41.992 [2024-06-10 11:39:25.872125] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:41.992 [2024-06-10 11:39:25.872177] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:41.992 [2024-06-10 11:39:25.872197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:41.992 pt1 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.992 11:39:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.252 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:42.252 "name": "raid_bdev1", 00:28:42.252 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:42.252 "strip_size_kb": 64, 
00:28:42.252 "state": "configuring", 00:28:42.252 "raid_level": "concat", 00:28:42.252 "superblock": true, 00:28:42.252 "num_base_bdevs": 2, 00:28:42.252 "num_base_bdevs_discovered": 1, 00:28:42.252 "num_base_bdevs_operational": 2, 00:28:42.252 "base_bdevs_list": [ 00:28:42.252 { 00:28:42.252 "name": "pt1", 00:28:42.252 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:42.252 "is_configured": true, 00:28:42.252 "data_offset": 2048, 00:28:42.252 "data_size": 63488 00:28:42.252 }, 00:28:42.252 { 00:28:42.252 "name": null, 00:28:42.252 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:42.252 "is_configured": false, 00:28:42.252 "data_offset": 2048, 00:28:42.252 "data_size": 63488 00:28:42.252 } 00:28:42.252 ] 00:28:42.252 }' 00:28:42.252 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:42.252 11:39:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:42.821 [2024-06-10 11:39:26.696919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:42.821 [2024-06-10 11:39:26.696959] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.821 [2024-06-10 11:39:26.696988] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd71330 00:28:42.821 [2024-06-10 11:39:26.696997] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.821 [2024-06-10 
11:39:26.697247] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.821 [2024-06-10 11:39:26.697259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:42.821 [2024-06-10 11:39:26.697304] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:42.821 [2024-06-10 11:39:26.697317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:42.821 [2024-06-10 11:39:26.697381] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd704f0 00:28:42.821 [2024-06-10 11:39:26.697387] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:42.821 [2024-06-10 11:39:26.697491] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1b9c0 00:28:42.821 [2024-06-10 11:39:26.697567] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd704f0 00:28:42.821 [2024-06-10 11:39:26.697573] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd704f0 00:28:42.821 [2024-06-10 11:39:26.697641] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:42.821 pt2 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:42.821 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:42.822 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.822 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.081 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.081 "name": "raid_bdev1", 00:28:43.081 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:43.081 "strip_size_kb": 64, 00:28:43.081 "state": "online", 00:28:43.081 "raid_level": "concat", 00:28:43.081 "superblock": true, 00:28:43.081 "num_base_bdevs": 2, 00:28:43.081 "num_base_bdevs_discovered": 2, 00:28:43.081 "num_base_bdevs_operational": 2, 00:28:43.081 "base_bdevs_list": [ 00:28:43.081 { 00:28:43.081 "name": "pt1", 00:28:43.081 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:43.081 "is_configured": true, 00:28:43.081 "data_offset": 2048, 00:28:43.081 "data_size": 63488 00:28:43.081 }, 00:28:43.081 { 00:28:43.081 "name": "pt2", 00:28:43.081 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.081 "is_configured": true, 00:28:43.081 "data_offset": 2048, 00:28:43.081 "data_size": 63488 00:28:43.081 } 00:28:43.081 ] 00:28:43.081 }' 00:28:43.081 11:39:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.081 11:39:26 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:43.649 [2024-06-10 11:39:27.535235] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:43.649 "name": "raid_bdev1", 00:28:43.649 "aliases": [ 00:28:43.649 "383569eb-04e9-424e-92a8-404f7ed5602c" 00:28:43.649 ], 00:28:43.649 "product_name": "Raid Volume", 00:28:43.649 "block_size": 512, 00:28:43.649 "num_blocks": 126976, 00:28:43.649 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:43.649 "assigned_rate_limits": { 00:28:43.649 "rw_ios_per_sec": 0, 00:28:43.649 "rw_mbytes_per_sec": 0, 00:28:43.649 "r_mbytes_per_sec": 0, 00:28:43.649 "w_mbytes_per_sec": 0 00:28:43.649 }, 00:28:43.649 "claimed": false, 00:28:43.649 "zoned": false, 00:28:43.649 "supported_io_types": { 00:28:43.649 "read": true, 00:28:43.649 "write": true, 00:28:43.649 "unmap": true, 00:28:43.649 "write_zeroes": true, 00:28:43.649 "flush": true, 00:28:43.649 "reset": true, 00:28:43.649 "compare": 
false, 00:28:43.649 "compare_and_write": false, 00:28:43.649 "abort": false, 00:28:43.649 "nvme_admin": false, 00:28:43.649 "nvme_io": false 00:28:43.649 }, 00:28:43.649 "memory_domains": [ 00:28:43.649 { 00:28:43.649 "dma_device_id": "system", 00:28:43.649 "dma_device_type": 1 00:28:43.649 }, 00:28:43.649 { 00:28:43.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:43.649 "dma_device_type": 2 00:28:43.649 }, 00:28:43.649 { 00:28:43.649 "dma_device_id": "system", 00:28:43.649 "dma_device_type": 1 00:28:43.649 }, 00:28:43.649 { 00:28:43.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:43.649 "dma_device_type": 2 00:28:43.649 } 00:28:43.649 ], 00:28:43.649 "driver_specific": { 00:28:43.649 "raid": { 00:28:43.649 "uuid": "383569eb-04e9-424e-92a8-404f7ed5602c", 00:28:43.649 "strip_size_kb": 64, 00:28:43.649 "state": "online", 00:28:43.649 "raid_level": "concat", 00:28:43.649 "superblock": true, 00:28:43.649 "num_base_bdevs": 2, 00:28:43.649 "num_base_bdevs_discovered": 2, 00:28:43.649 "num_base_bdevs_operational": 2, 00:28:43.649 "base_bdevs_list": [ 00:28:43.649 { 00:28:43.649 "name": "pt1", 00:28:43.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:43.649 "is_configured": true, 00:28:43.649 "data_offset": 2048, 00:28:43.649 "data_size": 63488 00:28:43.649 }, 00:28:43.649 { 00:28:43.649 "name": "pt2", 00:28:43.649 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.649 "is_configured": true, 00:28:43.649 "data_offset": 2048, 00:28:43.649 "data_size": 63488 00:28:43.649 } 00:28:43.649 ] 00:28:43.649 } 00:28:43.649 } 00:28:43.649 }' 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:43.649 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:43.649 pt2' 00:28:43.650 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:43.650 11:39:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:43.650 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:43.909 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:43.910 "name": "pt1", 00:28:43.910 "aliases": [ 00:28:43.910 "00000000-0000-0000-0000-000000000001" 00:28:43.910 ], 00:28:43.910 "product_name": "passthru", 00:28:43.910 "block_size": 512, 00:28:43.910 "num_blocks": 65536, 00:28:43.910 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:43.910 "assigned_rate_limits": { 00:28:43.910 "rw_ios_per_sec": 0, 00:28:43.910 "rw_mbytes_per_sec": 0, 00:28:43.910 "r_mbytes_per_sec": 0, 00:28:43.910 "w_mbytes_per_sec": 0 00:28:43.910 }, 00:28:43.910 "claimed": true, 00:28:43.910 "claim_type": "exclusive_write", 00:28:43.910 "zoned": false, 00:28:43.910 "supported_io_types": { 00:28:43.910 "read": true, 00:28:43.910 "write": true, 00:28:43.910 "unmap": true, 00:28:43.910 "write_zeroes": true, 00:28:43.910 "flush": true, 00:28:43.910 "reset": true, 00:28:43.910 "compare": false, 00:28:43.910 "compare_and_write": false, 00:28:43.910 "abort": true, 00:28:43.910 "nvme_admin": false, 00:28:43.910 "nvme_io": false 00:28:43.910 }, 00:28:43.910 "memory_domains": [ 00:28:43.910 { 00:28:43.910 "dma_device_id": "system", 00:28:43.910 "dma_device_type": 1 00:28:43.910 }, 00:28:43.910 { 00:28:43.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:43.910 "dma_device_type": 2 00:28:43.910 } 00:28:43.910 ], 00:28:43.910 "driver_specific": { 00:28:43.910 "passthru": { 00:28:43.910 "name": "pt1", 00:28:43.910 "base_bdev_name": "malloc1" 00:28:43.910 } 00:28:43.910 } 00:28:43.910 }' 00:28:43.910 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:43.910 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:28:43.910 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:43.910 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.169 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.169 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:44.169 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:44.169 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:44.169 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:44.169 11:39:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:44.169 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:44.169 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:44.169 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:44.169 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:44.169 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:44.428 "name": "pt2", 00:28:44.428 "aliases": [ 00:28:44.428 "00000000-0000-0000-0000-000000000002" 00:28:44.428 ], 00:28:44.428 "product_name": "passthru", 00:28:44.428 "block_size": 512, 00:28:44.428 "num_blocks": 65536, 00:28:44.428 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:44.428 "assigned_rate_limits": { 00:28:44.428 "rw_ios_per_sec": 0, 00:28:44.428 "rw_mbytes_per_sec": 0, 00:28:44.428 "r_mbytes_per_sec": 0, 00:28:44.428 "w_mbytes_per_sec": 0 00:28:44.428 }, 00:28:44.428 
"claimed": true, 00:28:44.428 "claim_type": "exclusive_write", 00:28:44.428 "zoned": false, 00:28:44.428 "supported_io_types": { 00:28:44.428 "read": true, 00:28:44.428 "write": true, 00:28:44.428 "unmap": true, 00:28:44.428 "write_zeroes": true, 00:28:44.428 "flush": true, 00:28:44.428 "reset": true, 00:28:44.428 "compare": false, 00:28:44.428 "compare_and_write": false, 00:28:44.428 "abort": true, 00:28:44.428 "nvme_admin": false, 00:28:44.428 "nvme_io": false 00:28:44.428 }, 00:28:44.428 "memory_domains": [ 00:28:44.428 { 00:28:44.428 "dma_device_id": "system", 00:28:44.428 "dma_device_type": 1 00:28:44.428 }, 00:28:44.428 { 00:28:44.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.428 "dma_device_type": 2 00:28:44.428 } 00:28:44.428 ], 00:28:44.428 "driver_specific": { 00:28:44.428 "passthru": { 00:28:44.428 "name": "pt2", 00:28:44.428 "base_bdev_name": "malloc2" 00:28:44.428 } 00:28:44.428 } 00:28:44.428 }' 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:44.428 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:44.688 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:44.947 [2024-06-10 11:39:28.658134] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 383569eb-04e9-424e-92a8-404f7ed5602c '!=' 383569eb-04e9-424e-92a8-404f7ed5602c ']' 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 138418 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 138418 ']' 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 138418 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 138418 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process 
with pid 138418' 00:28:44.947 killing process with pid 138418 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 138418 00:28:44.947 [2024-06-10 11:39:28.716878] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:44.947 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 138418 00:28:44.947 [2024-06-10 11:39:28.716918] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:44.947 [2024-06-10 11:39:28.716948] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:44.947 [2024-06-10 11:39:28.716956] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd704f0 name raid_bdev1, state offline 00:28:44.947 [2024-06-10 11:39:28.732811] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:45.207 11:39:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:28:45.207 00:28:45.207 real 0m8.170s 00:28:45.207 user 0m14.360s 00:28:45.207 sys 0m1.661s 00:28:45.207 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:45.207 11:39:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:45.207 ************************************ 00:28:45.207 END TEST raid_superblock_test 00:28:45.207 ************************************ 00:28:45.207 11:39:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:28:45.207 11:39:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:45.207 11:39:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:45.207 11:39:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:45.207 ************************************ 00:28:45.207 START TEST raid_read_error_test 00:28:45.207 ************************************ 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 read 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.sF6zP32iXk 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=139704 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 139704 /var/tmp/spdk-raid.sock 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 139704 ']' 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:45.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:45.207 11:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:45.207 [2024-06-10 11:39:29.034636] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:28:45.207 [2024-06-10 11:39:29.034684] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139704 ] 00:28:45.207 [2024-06-10 11:39:29.122633] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.467 [2024-06-10 11:39:29.211516] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:45.467 [2024-06-10 11:39:29.271315] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:45.467 [2024-06-10 11:39:29.271347] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:46.036 11:39:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:46.036 11:39:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:28:46.036 11:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:46.036 11:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:46.294 BaseBdev1_malloc 00:28:46.294 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:28:46.294 true 00:28:46.294 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:28:46.554 [2024-06-10 11:39:30.336187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:28:46.554 [2024-06-10 11:39:30.336224] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:46.554 
[2024-06-10 11:39:30.336238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1debb10 00:28:46.554 [2024-06-10 11:39:30.336247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:46.554 [2024-06-10 11:39:30.337592] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:46.554 [2024-06-10 11:39:30.337616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:46.554 BaseBdev1 00:28:46.554 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:46.554 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:46.813 BaseBdev2_malloc 00:28:46.813 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:28:46.813 true 00:28:46.813 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:28:47.072 [2024-06-10 11:39:30.838502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:28:47.072 [2024-06-10 11:39:30.838538] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:47.072 [2024-06-10 11:39:30.838572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df0280 00:28:47.072 [2024-06-10 11:39:30.838581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:47.072 [2024-06-10 11:39:30.839750] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:47.072 [2024-06-10 11:39:30.839773] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:47.072 BaseBdev2 00:28:47.072 11:39:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:28:47.072 [2024-06-10 11:39:31.002952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:47.072 [2024-06-10 11:39:31.004012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:47.072 [2024-06-10 11:39:31.004152] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1defca0 00:28:47.072 [2024-06-10 11:39:31.004161] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:47.072 [2024-06-10 11:39:31.004304] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df2d10 00:28:47.072 [2024-06-10 11:39:31.004409] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1defca0 00:28:47.072 [2024-06-10 11:39:31.004415] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1defca0 00:28:47.072 [2024-06-10 11:39:31.004487] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.332 "name": "raid_bdev1", 00:28:47.332 "uuid": "f098c500-5e94-47cc-a9fd-c9dff0149d95", 00:28:47.332 "strip_size_kb": 64, 00:28:47.332 "state": "online", 00:28:47.332 "raid_level": "concat", 00:28:47.332 "superblock": true, 00:28:47.332 "num_base_bdevs": 2, 00:28:47.332 "num_base_bdevs_discovered": 2, 00:28:47.332 "num_base_bdevs_operational": 2, 00:28:47.332 "base_bdevs_list": [ 00:28:47.332 { 00:28:47.332 "name": "BaseBdev1", 00:28:47.332 "uuid": "2e6dcf6d-af60-5d3e-a759-686d9173eec4", 00:28:47.332 "is_configured": true, 00:28:47.332 "data_offset": 2048, 00:28:47.332 "data_size": 63488 00:28:47.332 }, 00:28:47.332 { 00:28:47.332 "name": "BaseBdev2", 00:28:47.332 "uuid": "bca02fb4-2e0c-5bb4-a0bc-107fded51da4", 00:28:47.332 "is_configured": true, 00:28:47.332 "data_offset": 2048, 00:28:47.332 "data_size": 63488 00:28:47.332 } 00:28:47.332 ] 00:28:47.332 }' 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.332 11:39:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:47.899 11:39:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:28:47.899 11:39:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:47.899 [2024-06-10 11:39:31.769264] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ded450 00:28:48.835 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:49.094 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:49.095 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:49.095 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:49.095 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:49.095 11:39:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.095 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.095 11:39:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:49.354 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:49.354 "name": "raid_bdev1", 00:28:49.354 "uuid": "f098c500-5e94-47cc-a9fd-c9dff0149d95", 00:28:49.354 "strip_size_kb": 64, 00:28:49.354 "state": "online", 00:28:49.354 "raid_level": "concat", 00:28:49.354 "superblock": true, 00:28:49.354 "num_base_bdevs": 2, 00:28:49.354 "num_base_bdevs_discovered": 2, 00:28:49.354 "num_base_bdevs_operational": 2, 00:28:49.354 "base_bdevs_list": [ 00:28:49.354 { 00:28:49.354 "name": "BaseBdev1", 00:28:49.354 "uuid": "2e6dcf6d-af60-5d3e-a759-686d9173eec4", 00:28:49.354 "is_configured": true, 00:28:49.354 "data_offset": 2048, 00:28:49.354 "data_size": 63488 00:28:49.354 }, 00:28:49.354 { 00:28:49.354 "name": "BaseBdev2", 00:28:49.354 "uuid": "bca02fb4-2e0c-5bb4-a0bc-107fded51da4", 00:28:49.354 "is_configured": true, 00:28:49.354 "data_offset": 2048, 00:28:49.354 "data_size": 63488 00:28:49.354 } 00:28:49.354 ] 00:28:49.354 }' 00:28:49.354 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:49.354 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:49.613 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:49.873 [2024-06-10 11:39:33.705324] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:49.873 [2024-06-10 11:39:33.705360] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:28:49.873 [2024-06-10 11:39:33.707440] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:49.873 [2024-06-10 11:39:33.707464] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:49.873 [2024-06-10 11:39:33.707481] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:49.873 [2024-06-10 11:39:33.707489] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1defca0 name raid_bdev1, state offline 00:28:49.873 0 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 139704 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 139704 ']' 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 139704 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 139704 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 139704' 00:28:49.873 killing process with pid 139704 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 139704 00:28:49.873 [2024-06-10 11:39:33.771985] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:49.873 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 139704 00:28:49.873 [2024-06-10 11:39:33.782056] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:50.133 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.sF6zP32iXk 00:28:50.133 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:28:50.133 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:28:50.133 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:28:50.133 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:28:50.133 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:50.134 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:50.134 11:39:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:28:50.134 00:28:50.134 real 0m4.995s 00:28:50.134 user 0m7.510s 00:28:50.134 sys 0m0.880s 00:28:50.134 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:50.134 11:39:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:50.134 ************************************ 00:28:50.134 END TEST raid_read_error_test 00:28:50.134 ************************************ 00:28:50.134 11:39:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:28:50.134 11:39:34 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:50.134 11:39:34 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:50.134 11:39:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:50.134 ************************************ 00:28:50.134 START TEST raid_write_error_test 00:28:50.134 ************************************ 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 write 00:28:50.134 11:39:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:28:50.134 11:39:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vJNx5k2GMd 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=140506 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 140506 /var/tmp/spdk-raid.sock 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 140506 ']' 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:50.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:50.134 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:28:50.393 [2024-06-10 11:39:34.116720] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:28:50.393 [2024-06-10 11:39:34.116769] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140506 ] 00:28:50.393 [2024-06-10 11:39:34.200789] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:50.393 [2024-06-10 11:39:34.285265] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:50.393 [2024-06-10 11:39:34.337672] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:50.393 [2024-06-10 11:39:34.337698] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:51.381 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:51.381 11:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:28:51.381 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:51.381 11:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:51.381 BaseBdev1_malloc 00:28:51.382 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:28:51.382 true 00:28:51.382 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:28:51.640 [2024-06-10 11:39:35.420197] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:28:51.640 [2024-06-10 11:39:35.420231] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:51.640 
[2024-06-10 11:39:35.420245] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd4b10 00:28:51.640 [2024-06-10 11:39:35.420253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:51.640 [2024-06-10 11:39:35.421561] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:51.640 [2024-06-10 11:39:35.421583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:51.640 BaseBdev1 00:28:51.640 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:51.640 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:51.899 BaseBdev2_malloc 00:28:51.899 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:28:51.899 true 00:28:51.899 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:28:52.158 [2024-06-10 11:39:35.949302] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:28:52.159 [2024-06-10 11:39:35.949337] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:52.159 [2024-06-10 11:39:35.949351] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd9280 00:28:52.159 [2024-06-10 11:39:35.949359] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:52.159 [2024-06-10 11:39:35.950520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:52.159 [2024-06-10 11:39:35.950543] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:52.159 BaseBdev2 00:28:52.159 11:39:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:28:52.418 [2024-06-10 11:39:36.121784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:52.418 [2024-06-10 11:39:36.122771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:52.418 [2024-06-10 11:39:36.122914] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfd8ca0 00:28:52.418 [2024-06-10 11:39:36.122924] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:28:52.418 [2024-06-10 11:39:36.123059] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfdbd10 00:28:52.418 [2024-06-10 11:39:36.123159] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfd8ca0 00:28:52.418 [2024-06-10 11:39:36.123165] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfd8ca0 00:28:52.418 [2024-06-10 11:39:36.123235] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:52.418 "name": "raid_bdev1", 00:28:52.418 "uuid": "adee7309-f747-45a4-9762-6615d4bc55e9", 00:28:52.418 "strip_size_kb": 64, 00:28:52.418 "state": "online", 00:28:52.418 "raid_level": "concat", 00:28:52.418 "superblock": true, 00:28:52.418 "num_base_bdevs": 2, 00:28:52.418 "num_base_bdevs_discovered": 2, 00:28:52.418 "num_base_bdevs_operational": 2, 00:28:52.418 "base_bdevs_list": [ 00:28:52.418 { 00:28:52.418 "name": "BaseBdev1", 00:28:52.418 "uuid": "34cb9c21-8fcc-5bdd-8011-4cf01db0e0c9", 00:28:52.418 "is_configured": true, 00:28:52.418 "data_offset": 2048, 00:28:52.418 "data_size": 63488 00:28:52.418 }, 00:28:52.418 { 00:28:52.418 "name": "BaseBdev2", 00:28:52.418 "uuid": "7162b989-bd4c-5645-b279-282ad91c397d", 00:28:52.418 "is_configured": true, 00:28:52.418 "data_offset": 2048, 00:28:52.418 "data_size": 63488 00:28:52.418 } 00:28:52.418 ] 00:28:52.418 }' 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:52.418 11:39:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:52.986 
11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:28:52.986 11:39:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:52.986 [2024-06-10 11:39:36.895958] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd6450 00:28:53.922 11:39:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.181 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.440 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:54.440 "name": "raid_bdev1", 00:28:54.440 "uuid": "adee7309-f747-45a4-9762-6615d4bc55e9", 00:28:54.440 "strip_size_kb": 64, 00:28:54.440 "state": "online", 00:28:54.440 "raid_level": "concat", 00:28:54.440 "superblock": true, 00:28:54.440 "num_base_bdevs": 2, 00:28:54.440 "num_base_bdevs_discovered": 2, 00:28:54.440 "num_base_bdevs_operational": 2, 00:28:54.440 "base_bdevs_list": [ 00:28:54.440 { 00:28:54.440 "name": "BaseBdev1", 00:28:54.440 "uuid": "34cb9c21-8fcc-5bdd-8011-4cf01db0e0c9", 00:28:54.440 "is_configured": true, 00:28:54.440 "data_offset": 2048, 00:28:54.440 "data_size": 63488 00:28:54.440 }, 00:28:54.440 { 00:28:54.440 "name": "BaseBdev2", 00:28:54.440 "uuid": "7162b989-bd4c-5645-b279-282ad91c397d", 00:28:54.440 "is_configured": true, 00:28:54.440 "data_offset": 2048, 00:28:54.440 "data_size": 63488 00:28:54.440 } 00:28:54.440 ] 00:28:54.440 }' 00:28:54.440 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:54.440 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:55.008 [2024-06-10 11:39:38.824262] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:55.008 [2024-06-10 11:39:38.824301] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:28:55.008 [2024-06-10 11:39:38.826366] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:55.008 [2024-06-10 11:39:38.826390] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:55.008 [2024-06-10 11:39:38.826406] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:55.008 [2024-06-10 11:39:38.826413] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd8ca0 name raid_bdev1, state offline 00:28:55.008 0 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 140506 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 140506 ']' 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 140506 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 140506 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 140506' 00:28:55.008 killing process with pid 140506 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 140506 00:28:55.008 [2024-06-10 11:39:38.889796] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:55.008 11:39:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 140506 00:28:55.008 
[2024-06-10 11:39:38.899586] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vJNx5k2GMd 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:28:55.267 00:28:55.267 real 0m5.053s 00:28:55.267 user 0m7.612s 00:28:55.267 sys 0m0.870s 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:55.267 11:39:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:55.267 ************************************ 00:28:55.267 END TEST raid_write_error_test 00:28:55.267 ************************************ 00:28:55.267 11:39:39 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:28:55.267 11:39:39 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:28:55.267 11:39:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:55.267 11:39:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:55.267 11:39:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:55.267 ************************************ 00:28:55.267 START TEST raid_state_function_test 00:28:55.267 ************************************ 00:28:55.267 
11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 false 00:28:55.267 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:55.267 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:55.267 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:28:55.267 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:55.267 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:55.267 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:55.268 11:39:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=141263 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 141263' 00:28:55.268 Process raid pid: 141263 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 141263 /var/tmp/spdk-raid.sock 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 141263 ']' 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:55.268 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:55.268 11:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:55.526 [2024-06-10 11:39:39.257328] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:28:55.526 [2024-06-10 11:39:39.257384] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:55.526 [2024-06-10 11:39:39.344195] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:55.526 [2024-06-10 11:39:39.427473] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:55.785 [2024-06-10 11:39:39.490892] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:55.785 [2024-06-10 11:39:39.490916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:56.352 [2024-06-10 11:39:40.207653] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:56.352 [2024-06-10 11:39:40.207694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:56.352 [2024-06-10 11:39:40.207701] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:56.352 [2024-06-10 11:39:40.207724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't 
exist now 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.352 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:56.611 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.611 "name": "Existed_Raid", 00:28:56.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.611 "strip_size_kb": 0, 00:28:56.611 "state": "configuring", 00:28:56.611 "raid_level": "raid1", 00:28:56.611 "superblock": false, 00:28:56.611 "num_base_bdevs": 2, 00:28:56.611 "num_base_bdevs_discovered": 0, 00:28:56.611 "num_base_bdevs_operational": 2, 00:28:56.611 "base_bdevs_list": 
[ 00:28:56.611 { 00:28:56.611 "name": "BaseBdev1", 00:28:56.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.611 "is_configured": false, 00:28:56.611 "data_offset": 0, 00:28:56.611 "data_size": 0 00:28:56.611 }, 00:28:56.611 { 00:28:56.611 "name": "BaseBdev2", 00:28:56.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.611 "is_configured": false, 00:28:56.611 "data_offset": 0, 00:28:56.611 "data_size": 0 00:28:56.611 } 00:28:56.611 ] 00:28:56.611 }' 00:28:56.611 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.611 11:39:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:57.179 11:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:57.179 [2024-06-10 11:39:41.053750] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:57.179 [2024-06-10 11:39:41.053773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x261d510 name Existed_Raid, state configuring 00:28:57.179 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:57.437 [2024-06-10 11:39:41.226207] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:57.437 [2024-06-10 11:39:41.226228] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:57.437 [2024-06-10 11:39:41.226235] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:57.437 [2024-06-10 11:39:41.226242] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:57.437 11:39:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:28:57.696 [2024-06-10 11:39:41.407295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:57.696 BaseBdev1 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:57.696 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:57.954 [ 00:28:57.954 { 00:28:57.954 "name": "BaseBdev1", 00:28:57.954 "aliases": [ 00:28:57.954 "fd194712-740c-44c5-bdc5-60b7fa6a5c64" 00:28:57.954 ], 00:28:57.954 "product_name": "Malloc disk", 00:28:57.954 "block_size": 512, 00:28:57.954 "num_blocks": 65536, 00:28:57.954 "uuid": "fd194712-740c-44c5-bdc5-60b7fa6a5c64", 00:28:57.954 "assigned_rate_limits": { 00:28:57.954 "rw_ios_per_sec": 0, 00:28:57.954 "rw_mbytes_per_sec": 0, 00:28:57.954 "r_mbytes_per_sec": 0, 00:28:57.954 "w_mbytes_per_sec": 0 00:28:57.954 }, 00:28:57.954 "claimed": true, 00:28:57.954 "claim_type": 
"exclusive_write", 00:28:57.954 "zoned": false, 00:28:57.954 "supported_io_types": { 00:28:57.954 "read": true, 00:28:57.954 "write": true, 00:28:57.954 "unmap": true, 00:28:57.954 "write_zeroes": true, 00:28:57.954 "flush": true, 00:28:57.954 "reset": true, 00:28:57.954 "compare": false, 00:28:57.954 "compare_and_write": false, 00:28:57.954 "abort": true, 00:28:57.954 "nvme_admin": false, 00:28:57.954 "nvme_io": false 00:28:57.954 }, 00:28:57.954 "memory_domains": [ 00:28:57.954 { 00:28:57.954 "dma_device_id": "system", 00:28:57.954 "dma_device_type": 1 00:28:57.954 }, 00:28:57.954 { 00:28:57.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:57.954 "dma_device_type": 2 00:28:57.954 } 00:28:57.954 ], 00:28:57.954 "driver_specific": {} 00:28:57.954 } 00:28:57.954 ] 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.954 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:58.213 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:58.213 "name": "Existed_Raid", 00:28:58.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.213 "strip_size_kb": 0, 00:28:58.213 "state": "configuring", 00:28:58.213 "raid_level": "raid1", 00:28:58.213 "superblock": false, 00:28:58.213 "num_base_bdevs": 2, 00:28:58.213 "num_base_bdevs_discovered": 1, 00:28:58.213 "num_base_bdevs_operational": 2, 00:28:58.213 "base_bdevs_list": [ 00:28:58.213 { 00:28:58.213 "name": "BaseBdev1", 00:28:58.213 "uuid": "fd194712-740c-44c5-bdc5-60b7fa6a5c64", 00:28:58.213 "is_configured": true, 00:28:58.213 "data_offset": 0, 00:28:58.213 "data_size": 65536 00:28:58.213 }, 00:28:58.213 { 00:28:58.213 "name": "BaseBdev2", 00:28:58.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.213 "is_configured": false, 00:28:58.213 "data_offset": 0, 00:28:58.213 "data_size": 0 00:28:58.213 } 00:28:58.213 ] 00:28:58.213 }' 00:28:58.213 11:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:58.213 11:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:58.779 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:58.779 [2024-06-10 11:39:42.594389] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:58.779 [2024-06-10 11:39:42.594425] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x261ce00 name 
Existed_Raid, state configuring 00:28:58.779 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:59.038 [2024-06-10 11:39:42.766846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:59.038 [2024-06-10 11:39:42.767862] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:59.038 [2024-06-10 11:39:42.767911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:59.038 "name": "Existed_Raid", 00:28:59.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:59.038 "strip_size_kb": 0, 00:28:59.038 "state": "configuring", 00:28:59.038 "raid_level": "raid1", 00:28:59.038 "superblock": false, 00:28:59.038 "num_base_bdevs": 2, 00:28:59.038 "num_base_bdevs_discovered": 1, 00:28:59.038 "num_base_bdevs_operational": 2, 00:28:59.038 "base_bdevs_list": [ 00:28:59.038 { 00:28:59.038 "name": "BaseBdev1", 00:28:59.038 "uuid": "fd194712-740c-44c5-bdc5-60b7fa6a5c64", 00:28:59.038 "is_configured": true, 00:28:59.038 "data_offset": 0, 00:28:59.038 "data_size": 65536 00:28:59.038 }, 00:28:59.038 { 00:28:59.038 "name": "BaseBdev2", 00:28:59.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:59.038 "is_configured": false, 00:28:59.038 "data_offset": 0, 00:28:59.038 "data_size": 0 00:28:59.038 } 00:28:59.038 ] 00:28:59.038 }' 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:59.038 11:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:59.603 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:28:59.862 [2024-06-10 11:39:43.612005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:59.862 [2024-06-10 11:39:43.612041] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device 
register 0x261dbf0 00:28:59.862 [2024-06-10 11:39:43.612047] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:28:59.862 [2024-06-10 11:39:43.612223] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27cf9b0 00:28:59.862 [2024-06-10 11:39:43.612307] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x261dbf0 00:28:59.862 [2024-06-10 11:39:43.612314] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x261dbf0 00:28:59.862 [2024-06-10 11:39:43.612432] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:59.862 BaseBdev2 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:59.862 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:00.120 [ 00:29:00.120 { 00:29:00.120 "name": "BaseBdev2", 00:29:00.120 "aliases": [ 00:29:00.120 "0eaf3707-5c12-4c2f-996d-9e0103131c91" 00:29:00.120 ], 00:29:00.120 "product_name": "Malloc disk", 00:29:00.120 
"block_size": 512, 00:29:00.120 "num_blocks": 65536, 00:29:00.120 "uuid": "0eaf3707-5c12-4c2f-996d-9e0103131c91", 00:29:00.120 "assigned_rate_limits": { 00:29:00.120 "rw_ios_per_sec": 0, 00:29:00.120 "rw_mbytes_per_sec": 0, 00:29:00.120 "r_mbytes_per_sec": 0, 00:29:00.120 "w_mbytes_per_sec": 0 00:29:00.120 }, 00:29:00.120 "claimed": true, 00:29:00.120 "claim_type": "exclusive_write", 00:29:00.120 "zoned": false, 00:29:00.120 "supported_io_types": { 00:29:00.120 "read": true, 00:29:00.120 "write": true, 00:29:00.120 "unmap": true, 00:29:00.120 "write_zeroes": true, 00:29:00.120 "flush": true, 00:29:00.120 "reset": true, 00:29:00.120 "compare": false, 00:29:00.120 "compare_and_write": false, 00:29:00.120 "abort": true, 00:29:00.120 "nvme_admin": false, 00:29:00.120 "nvme_io": false 00:29:00.120 }, 00:29:00.120 "memory_domains": [ 00:29:00.120 { 00:29:00.120 "dma_device_id": "system", 00:29:00.120 "dma_device_type": 1 00:29:00.120 }, 00:29:00.120 { 00:29:00.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.120 "dma_device_type": 2 00:29:00.120 } 00:29:00.120 ], 00:29:00.120 "driver_specific": {} 00:29:00.120 } 00:29:00.120 ] 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:00.120 11:39:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.120 11:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:00.378 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:00.378 "name": "Existed_Raid", 00:29:00.378 "uuid": "20870ace-2c13-4e7c-a874-f01889dc856d", 00:29:00.378 "strip_size_kb": 0, 00:29:00.378 "state": "online", 00:29:00.378 "raid_level": "raid1", 00:29:00.378 "superblock": false, 00:29:00.378 "num_base_bdevs": 2, 00:29:00.378 "num_base_bdevs_discovered": 2, 00:29:00.378 "num_base_bdevs_operational": 2, 00:29:00.378 "base_bdevs_list": [ 00:29:00.378 { 00:29:00.378 "name": "BaseBdev1", 00:29:00.378 "uuid": "fd194712-740c-44c5-bdc5-60b7fa6a5c64", 00:29:00.378 "is_configured": true, 00:29:00.378 "data_offset": 0, 00:29:00.378 "data_size": 65536 00:29:00.378 }, 00:29:00.378 { 00:29:00.378 "name": "BaseBdev2", 00:29:00.378 "uuid": "0eaf3707-5c12-4c2f-996d-9e0103131c91", 00:29:00.378 "is_configured": true, 00:29:00.378 "data_offset": 0, 00:29:00.378 "data_size": 65536 00:29:00.378 } 00:29:00.378 ] 00:29:00.378 }' 00:29:00.378 11:39:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:00.378 11:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:00.945 [2024-06-10 11:39:44.799210] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:00.945 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:00.945 "name": "Existed_Raid", 00:29:00.945 "aliases": [ 00:29:00.945 "20870ace-2c13-4e7c-a874-f01889dc856d" 00:29:00.945 ], 00:29:00.945 "product_name": "Raid Volume", 00:29:00.945 "block_size": 512, 00:29:00.945 "num_blocks": 65536, 00:29:00.945 "uuid": "20870ace-2c13-4e7c-a874-f01889dc856d", 00:29:00.945 "assigned_rate_limits": { 00:29:00.945 "rw_ios_per_sec": 0, 00:29:00.945 "rw_mbytes_per_sec": 0, 00:29:00.945 "r_mbytes_per_sec": 0, 00:29:00.945 "w_mbytes_per_sec": 0 00:29:00.945 }, 00:29:00.945 "claimed": false, 00:29:00.945 "zoned": false, 00:29:00.945 "supported_io_types": { 00:29:00.945 "read": 
true, 00:29:00.945 "write": true, 00:29:00.945 "unmap": false, 00:29:00.945 "write_zeroes": true, 00:29:00.945 "flush": false, 00:29:00.945 "reset": true, 00:29:00.945 "compare": false, 00:29:00.945 "compare_and_write": false, 00:29:00.945 "abort": false, 00:29:00.945 "nvme_admin": false, 00:29:00.945 "nvme_io": false 00:29:00.945 }, 00:29:00.945 "memory_domains": [ 00:29:00.945 { 00:29:00.945 "dma_device_id": "system", 00:29:00.945 "dma_device_type": 1 00:29:00.945 }, 00:29:00.945 { 00:29:00.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.945 "dma_device_type": 2 00:29:00.945 }, 00:29:00.945 { 00:29:00.945 "dma_device_id": "system", 00:29:00.945 "dma_device_type": 1 00:29:00.945 }, 00:29:00.945 { 00:29:00.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.945 "dma_device_type": 2 00:29:00.945 } 00:29:00.945 ], 00:29:00.945 "driver_specific": { 00:29:00.945 "raid": { 00:29:00.945 "uuid": "20870ace-2c13-4e7c-a874-f01889dc856d", 00:29:00.945 "strip_size_kb": 0, 00:29:00.945 "state": "online", 00:29:00.945 "raid_level": "raid1", 00:29:00.945 "superblock": false, 00:29:00.945 "num_base_bdevs": 2, 00:29:00.945 "num_base_bdevs_discovered": 2, 00:29:00.945 "num_base_bdevs_operational": 2, 00:29:00.945 "base_bdevs_list": [ 00:29:00.945 { 00:29:00.945 "name": "BaseBdev1", 00:29:00.945 "uuid": "fd194712-740c-44c5-bdc5-60b7fa6a5c64", 00:29:00.945 "is_configured": true, 00:29:00.945 "data_offset": 0, 00:29:00.945 "data_size": 65536 00:29:00.945 }, 00:29:00.945 { 00:29:00.945 "name": "BaseBdev2", 00:29:00.946 "uuid": "0eaf3707-5c12-4c2f-996d-9e0103131c91", 00:29:00.946 "is_configured": true, 00:29:00.946 "data_offset": 0, 00:29:00.946 "data_size": 65536 00:29:00.946 } 00:29:00.946 ] 00:29:00.946 } 00:29:00.946 } 00:29:00.946 }' 00:29:00.946 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:00.946 11:39:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:00.946 BaseBdev2' 00:29:00.946 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:00.946 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:00.946 11:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:01.204 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:01.204 "name": "BaseBdev1", 00:29:01.204 "aliases": [ 00:29:01.204 "fd194712-740c-44c5-bdc5-60b7fa6a5c64" 00:29:01.204 ], 00:29:01.204 "product_name": "Malloc disk", 00:29:01.204 "block_size": 512, 00:29:01.204 "num_blocks": 65536, 00:29:01.204 "uuid": "fd194712-740c-44c5-bdc5-60b7fa6a5c64", 00:29:01.204 "assigned_rate_limits": { 00:29:01.204 "rw_ios_per_sec": 0, 00:29:01.204 "rw_mbytes_per_sec": 0, 00:29:01.204 "r_mbytes_per_sec": 0, 00:29:01.204 "w_mbytes_per_sec": 0 00:29:01.204 }, 00:29:01.204 "claimed": true, 00:29:01.204 "claim_type": "exclusive_write", 00:29:01.204 "zoned": false, 00:29:01.204 "supported_io_types": { 00:29:01.204 "read": true, 00:29:01.204 "write": true, 00:29:01.204 "unmap": true, 00:29:01.204 "write_zeroes": true, 00:29:01.204 "flush": true, 00:29:01.204 "reset": true, 00:29:01.204 "compare": false, 00:29:01.204 "compare_and_write": false, 00:29:01.204 "abort": true, 00:29:01.204 "nvme_admin": false, 00:29:01.204 "nvme_io": false 00:29:01.204 }, 00:29:01.204 "memory_domains": [ 00:29:01.204 { 00:29:01.204 "dma_device_id": "system", 00:29:01.204 "dma_device_type": 1 00:29:01.204 }, 00:29:01.204 { 00:29:01.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:01.204 "dma_device_type": 2 00:29:01.204 } 00:29:01.204 ], 00:29:01.204 "driver_specific": {} 00:29:01.204 }' 00:29:01.204 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:29:01.204 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:01.204 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:01.204 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:01.462 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:01.720 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:01.720 "name": "BaseBdev2", 00:29:01.720 "aliases": [ 00:29:01.720 "0eaf3707-5c12-4c2f-996d-9e0103131c91" 00:29:01.720 ], 00:29:01.720 "product_name": "Malloc disk", 00:29:01.720 "block_size": 512, 00:29:01.720 "num_blocks": 65536, 00:29:01.720 "uuid": "0eaf3707-5c12-4c2f-996d-9e0103131c91", 00:29:01.720 
"assigned_rate_limits": { 00:29:01.720 "rw_ios_per_sec": 0, 00:29:01.720 "rw_mbytes_per_sec": 0, 00:29:01.720 "r_mbytes_per_sec": 0, 00:29:01.720 "w_mbytes_per_sec": 0 00:29:01.720 }, 00:29:01.720 "claimed": true, 00:29:01.720 "claim_type": "exclusive_write", 00:29:01.720 "zoned": false, 00:29:01.720 "supported_io_types": { 00:29:01.720 "read": true, 00:29:01.720 "write": true, 00:29:01.720 "unmap": true, 00:29:01.720 "write_zeroes": true, 00:29:01.720 "flush": true, 00:29:01.720 "reset": true, 00:29:01.720 "compare": false, 00:29:01.720 "compare_and_write": false, 00:29:01.720 "abort": true, 00:29:01.720 "nvme_admin": false, 00:29:01.720 "nvme_io": false 00:29:01.720 }, 00:29:01.720 "memory_domains": [ 00:29:01.720 { 00:29:01.720 "dma_device_id": "system", 00:29:01.720 "dma_device_type": 1 00:29:01.720 }, 00:29:01.720 { 00:29:01.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:01.720 "dma_device_type": 2 00:29:01.720 } 00:29:01.720 ], 00:29:01.720 "driver_specific": {} 00:29:01.720 }' 00:29:01.720 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:01.720 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:01.720 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:01.720 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.720 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:01.979 11:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:02.245 [2024-06-10 11:39:46.006224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.245 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:02.504 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:02.504 "name": "Existed_Raid", 00:29:02.504 "uuid": "20870ace-2c13-4e7c-a874-f01889dc856d", 00:29:02.504 "strip_size_kb": 0, 00:29:02.504 "state": "online", 00:29:02.504 "raid_level": "raid1", 00:29:02.504 "superblock": false, 00:29:02.504 "num_base_bdevs": 2, 00:29:02.504 "num_base_bdevs_discovered": 1, 00:29:02.504 "num_base_bdevs_operational": 1, 00:29:02.504 "base_bdevs_list": [ 00:29:02.504 { 00:29:02.504 "name": null, 00:29:02.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:02.504 "is_configured": false, 00:29:02.504 "data_offset": 0, 00:29:02.504 "data_size": 65536 00:29:02.504 }, 00:29:02.504 { 00:29:02.504 "name": "BaseBdev2", 00:29:02.505 "uuid": "0eaf3707-5c12-4c2f-996d-9e0103131c91", 00:29:02.505 "is_configured": true, 00:29:02.505 "data_offset": 0, 00:29:02.505 "data_size": 65536 00:29:02.505 } 00:29:02.505 ] 00:29:02.505 }' 00:29:02.505 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:02.505 11:39:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:02.764 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:02.764 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:02.764 11:39:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:02.764 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.023 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:03.023 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:03.023 11:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:03.283 [2024-06-10 11:39:47.018548] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:03.283 [2024-06-10 11:39:47.018617] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:03.283 [2024-06-10 11:39:47.030549] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:03.283 [2024-06-10 11:39:47.030594] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:03.283 [2024-06-10 11:39:47.030603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x261dbf0 name Existed_Raid, state offline 00:29:03.283 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:03.283 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:03.283 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.283 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:03.283 11:39:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 141263 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 141263 ']' 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 141263 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 141263 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 141263' 00:29:03.542 killing process with pid 141263 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 141263 00:29:03.542 [2024-06-10 11:39:47.277116] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 141263 00:29:03.542 [2024-06-10 11:39:47.277969] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:29:03.542 00:29:03.542 real 0m8.279s 00:29:03.542 user 0m14.534s 00:29:03.542 sys 0m1.633s 00:29:03.542 11:39:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:03.542 11:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:03.542 ************************************ 00:29:03.542 END TEST raid_state_function_test 00:29:03.542 ************************************ 00:29:03.801 11:39:47 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:29:03.802 11:39:47 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:03.802 11:39:47 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:03.802 11:39:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:03.802 ************************************ 00:29:03.802 START TEST raid_state_function_test_sb 00:29:03.802 ************************************ 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=142594 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 142594' 00:29:03.802 Process raid pid: 142594 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:03.802 11:39:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 142594 /var/tmp/spdk-raid.sock 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 142594 ']' 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:03.802 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:03.802 11:39:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:03.802 [2024-06-10 11:39:47.617025] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:29:03.802 [2024-06-10 11:39:47.617081] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:03.802 [2024-06-10 11:39:47.706894] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.061 [2024-06-10 11:39:47.794135] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.061 [2024-06-10 11:39:47.848394] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:04.061 [2024-06-10 11:39:47.848420] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:04.628 11:39:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:04.628 11:39:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:29:04.628 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:04.628 [2024-06-10 11:39:48.572206] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:04.628 [2024-06-10 11:39:48.572245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:04.628 [2024-06-10 11:39:48.572252] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:04.628 [2024-06-10 11:39:48.572260] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:04.888 11:39:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:04.888 "name": "Existed_Raid", 00:29:04.888 "uuid": "16d10034-d3cb-4050-b1e2-982f538567b6", 00:29:04.888 "strip_size_kb": 0, 00:29:04.888 "state": "configuring", 00:29:04.888 "raid_level": "raid1", 00:29:04.888 "superblock": true, 00:29:04.888 "num_base_bdevs": 2, 00:29:04.888 "num_base_bdevs_discovered": 0, 00:29:04.888 "num_base_bdevs_operational": 2, 00:29:04.888 "base_bdevs_list": [ 00:29:04.888 { 00:29:04.888 "name": "BaseBdev1", 00:29:04.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:04.888 "is_configured": false, 00:29:04.888 "data_offset": 0, 00:29:04.888 "data_size": 0 00:29:04.888 }, 00:29:04.888 { 00:29:04.888 "name": 
"BaseBdev2", 00:29:04.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:04.888 "is_configured": false, 00:29:04.888 "data_offset": 0, 00:29:04.888 "data_size": 0 00:29:04.888 } 00:29:04.888 ] 00:29:04.888 }' 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:04.888 11:39:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:05.457 11:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:05.457 [2024-06-10 11:39:49.402230] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:05.457 [2024-06-10 11:39:49.402257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa60510 name Existed_Raid, state configuring 00:29:05.716 11:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:05.716 [2024-06-10 11:39:49.574685] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:05.716 [2024-06-10 11:39:49.574705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:05.716 [2024-06-10 11:39:49.574711] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:05.716 [2024-06-10 11:39:49.574719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:05.716 11:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:05.975 [2024-06-10 11:39:49.759736] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:29:05.975 BaseBdev1 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:05.975 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:06.235 11:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:06.235 [ 00:29:06.235 { 00:29:06.235 "name": "BaseBdev1", 00:29:06.235 "aliases": [ 00:29:06.235 "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56" 00:29:06.235 ], 00:29:06.235 "product_name": "Malloc disk", 00:29:06.235 "block_size": 512, 00:29:06.235 "num_blocks": 65536, 00:29:06.235 "uuid": "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56", 00:29:06.235 "assigned_rate_limits": { 00:29:06.235 "rw_ios_per_sec": 0, 00:29:06.235 "rw_mbytes_per_sec": 0, 00:29:06.235 "r_mbytes_per_sec": 0, 00:29:06.235 "w_mbytes_per_sec": 0 00:29:06.235 }, 00:29:06.235 "claimed": true, 00:29:06.235 "claim_type": "exclusive_write", 00:29:06.235 "zoned": false, 00:29:06.235 "supported_io_types": { 00:29:06.235 "read": true, 00:29:06.235 "write": true, 00:29:06.235 "unmap": true, 00:29:06.235 "write_zeroes": true, 00:29:06.235 "flush": true, 00:29:06.235 
"reset": true, 00:29:06.235 "compare": false, 00:29:06.235 "compare_and_write": false, 00:29:06.235 "abort": true, 00:29:06.235 "nvme_admin": false, 00:29:06.235 "nvme_io": false 00:29:06.235 }, 00:29:06.235 "memory_domains": [ 00:29:06.235 { 00:29:06.235 "dma_device_id": "system", 00:29:06.235 "dma_device_type": 1 00:29:06.235 }, 00:29:06.235 { 00:29:06.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:06.235 "dma_device_type": 2 00:29:06.235 } 00:29:06.235 ], 00:29:06.235 "driver_specific": {} 00:29:06.235 } 00:29:06.235 ] 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.235 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:06.494 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.494 "name": "Existed_Raid", 00:29:06.494 "uuid": "3173e39f-16b3-4f87-9586-36033f14e53e", 00:29:06.494 "strip_size_kb": 0, 00:29:06.494 "state": "configuring", 00:29:06.494 "raid_level": "raid1", 00:29:06.494 "superblock": true, 00:29:06.494 "num_base_bdevs": 2, 00:29:06.494 "num_base_bdevs_discovered": 1, 00:29:06.494 "num_base_bdevs_operational": 2, 00:29:06.494 "base_bdevs_list": [ 00:29:06.494 { 00:29:06.494 "name": "BaseBdev1", 00:29:06.494 "uuid": "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56", 00:29:06.494 "is_configured": true, 00:29:06.494 "data_offset": 2048, 00:29:06.494 "data_size": 63488 00:29:06.494 }, 00:29:06.494 { 00:29:06.494 "name": "BaseBdev2", 00:29:06.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:06.494 "is_configured": false, 00:29:06.494 "data_offset": 0, 00:29:06.494 "data_size": 0 00:29:06.494 } 00:29:06.494 ] 00:29:06.494 }' 00:29:06.494 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.494 11:39:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:07.063 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:07.063 [2024-06-10 11:39:50.938775] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:07.063 [2024-06-10 11:39:50.938806] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa5fe00 name Existed_Raid, state configuring 00:29:07.063 11:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:07.322 [2024-06-10 11:39:51.119281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:07.322 [2024-06-10 11:39:51.120449] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:07.322 [2024-06-10 11:39:51.120476] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:07.322 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:07.322 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:07.322 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:07.323 11:39:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.323 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:07.582 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.582 "name": "Existed_Raid", 00:29:07.582 "uuid": "0ce3be51-7f8d-4f2e-bd49-31aaa0341f70", 00:29:07.582 "strip_size_kb": 0, 00:29:07.582 "state": "configuring", 00:29:07.582 "raid_level": "raid1", 00:29:07.582 "superblock": true, 00:29:07.582 "num_base_bdevs": 2, 00:29:07.582 "num_base_bdevs_discovered": 1, 00:29:07.582 "num_base_bdevs_operational": 2, 00:29:07.582 "base_bdevs_list": [ 00:29:07.582 { 00:29:07.582 "name": "BaseBdev1", 00:29:07.582 "uuid": "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56", 00:29:07.582 "is_configured": true, 00:29:07.582 "data_offset": 2048, 00:29:07.582 "data_size": 63488 00:29:07.582 }, 00:29:07.582 { 00:29:07.582 "name": "BaseBdev2", 00:29:07.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.582 "is_configured": false, 00:29:07.582 "data_offset": 0, 00:29:07.582 "data_size": 0 00:29:07.582 } 00:29:07.582 ] 00:29:07.582 }' 00:29:07.582 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.582 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:07.841 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:08.101 [2024-06-10 11:39:51.948316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:08.101 [2024-06-10 11:39:51.948432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa60bf0 00:29:08.101 
[2024-06-10 11:39:51.948442] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:08.101 [2024-06-10 11:39:51.948559] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc129b0 00:29:08.101 [2024-06-10 11:39:51.948644] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa60bf0 00:29:08.101 [2024-06-10 11:39:51.948651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa60bf0 00:29:08.101 [2024-06-10 11:39:51.948713] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:08.101 BaseBdev2 00:29:08.101 11:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:08.101 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:08.101 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:08.101 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:08.101 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:08.101 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:08.102 11:39:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:08.371 11:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:08.371 [ 00:29:08.371 { 00:29:08.371 "name": "BaseBdev2", 00:29:08.371 "aliases": [ 00:29:08.371 "8dede834-df5b-47a9-a3a6-eb4ca87d7627" 00:29:08.371 ], 00:29:08.371 "product_name": "Malloc disk", 00:29:08.371 "block_size": 512, 
00:29:08.371 "num_blocks": 65536, 00:29:08.371 "uuid": "8dede834-df5b-47a9-a3a6-eb4ca87d7627", 00:29:08.371 "assigned_rate_limits": { 00:29:08.372 "rw_ios_per_sec": 0, 00:29:08.372 "rw_mbytes_per_sec": 0, 00:29:08.372 "r_mbytes_per_sec": 0, 00:29:08.372 "w_mbytes_per_sec": 0 00:29:08.372 }, 00:29:08.372 "claimed": true, 00:29:08.372 "claim_type": "exclusive_write", 00:29:08.372 "zoned": false, 00:29:08.372 "supported_io_types": { 00:29:08.372 "read": true, 00:29:08.372 "write": true, 00:29:08.372 "unmap": true, 00:29:08.372 "write_zeroes": true, 00:29:08.372 "flush": true, 00:29:08.372 "reset": true, 00:29:08.372 "compare": false, 00:29:08.372 "compare_and_write": false, 00:29:08.372 "abort": true, 00:29:08.372 "nvme_admin": false, 00:29:08.372 "nvme_io": false 00:29:08.372 }, 00:29:08.372 "memory_domains": [ 00:29:08.372 { 00:29:08.372 "dma_device_id": "system", 00:29:08.372 "dma_device_type": 1 00:29:08.372 }, 00:29:08.372 { 00:29:08.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:08.372 "dma_device_type": 2 00:29:08.372 } 00:29:08.372 ], 00:29:08.372 "driver_specific": {} 00:29:08.372 } 00:29:08.372 ] 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:08.637 11:39:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:08.637 "name": "Existed_Raid", 00:29:08.637 "uuid": "0ce3be51-7f8d-4f2e-bd49-31aaa0341f70", 00:29:08.637 "strip_size_kb": 0, 00:29:08.637 "state": "online", 00:29:08.637 "raid_level": "raid1", 00:29:08.637 "superblock": true, 00:29:08.637 "num_base_bdevs": 2, 00:29:08.637 "num_base_bdevs_discovered": 2, 00:29:08.637 "num_base_bdevs_operational": 2, 00:29:08.637 "base_bdevs_list": [ 00:29:08.637 { 00:29:08.637 "name": "BaseBdev1", 00:29:08.637 "uuid": "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56", 00:29:08.637 "is_configured": true, 00:29:08.637 "data_offset": 2048, 00:29:08.637 "data_size": 63488 00:29:08.637 }, 00:29:08.637 { 00:29:08.637 "name": "BaseBdev2", 00:29:08.637 "uuid": "8dede834-df5b-47a9-a3a6-eb4ca87d7627", 00:29:08.637 "is_configured": true, 00:29:08.637 "data_offset": 2048, 00:29:08.637 "data_size": 63488 00:29:08.637 } 00:29:08.637 ] 00:29:08.637 }' 00:29:08.637 
11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:08.637 11:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:09.205 11:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:09.205 [2024-06-10 11:39:53.131534] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:09.463 "name": "Existed_Raid", 00:29:09.463 "aliases": [ 00:29:09.463 "0ce3be51-7f8d-4f2e-bd49-31aaa0341f70" 00:29:09.463 ], 00:29:09.463 "product_name": "Raid Volume", 00:29:09.463 "block_size": 512, 00:29:09.463 "num_blocks": 63488, 00:29:09.463 "uuid": "0ce3be51-7f8d-4f2e-bd49-31aaa0341f70", 00:29:09.463 "assigned_rate_limits": { 00:29:09.463 "rw_ios_per_sec": 0, 00:29:09.463 "rw_mbytes_per_sec": 0, 00:29:09.463 "r_mbytes_per_sec": 0, 00:29:09.463 "w_mbytes_per_sec": 0 00:29:09.463 }, 00:29:09.463 "claimed": false, 00:29:09.463 "zoned": false, 00:29:09.463 
"supported_io_types": { 00:29:09.463 "read": true, 00:29:09.463 "write": true, 00:29:09.463 "unmap": false, 00:29:09.463 "write_zeroes": true, 00:29:09.463 "flush": false, 00:29:09.463 "reset": true, 00:29:09.463 "compare": false, 00:29:09.463 "compare_and_write": false, 00:29:09.463 "abort": false, 00:29:09.463 "nvme_admin": false, 00:29:09.463 "nvme_io": false 00:29:09.463 }, 00:29:09.463 "memory_domains": [ 00:29:09.463 { 00:29:09.463 "dma_device_id": "system", 00:29:09.463 "dma_device_type": 1 00:29:09.463 }, 00:29:09.463 { 00:29:09.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.463 "dma_device_type": 2 00:29:09.463 }, 00:29:09.463 { 00:29:09.463 "dma_device_id": "system", 00:29:09.463 "dma_device_type": 1 00:29:09.463 }, 00:29:09.463 { 00:29:09.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.463 "dma_device_type": 2 00:29:09.463 } 00:29:09.463 ], 00:29:09.463 "driver_specific": { 00:29:09.463 "raid": { 00:29:09.463 "uuid": "0ce3be51-7f8d-4f2e-bd49-31aaa0341f70", 00:29:09.463 "strip_size_kb": 0, 00:29:09.463 "state": "online", 00:29:09.463 "raid_level": "raid1", 00:29:09.463 "superblock": true, 00:29:09.463 "num_base_bdevs": 2, 00:29:09.463 "num_base_bdevs_discovered": 2, 00:29:09.463 "num_base_bdevs_operational": 2, 00:29:09.463 "base_bdevs_list": [ 00:29:09.463 { 00:29:09.463 "name": "BaseBdev1", 00:29:09.463 "uuid": "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56", 00:29:09.463 "is_configured": true, 00:29:09.463 "data_offset": 2048, 00:29:09.463 "data_size": 63488 00:29:09.463 }, 00:29:09.463 { 00:29:09.463 "name": "BaseBdev2", 00:29:09.463 "uuid": "8dede834-df5b-47a9-a3a6-eb4ca87d7627", 00:29:09.463 "is_configured": true, 00:29:09.463 "data_offset": 2048, 00:29:09.463 "data_size": 63488 00:29:09.463 } 00:29:09.463 ] 00:29:09.463 } 00:29:09.463 } 00:29:09.463 }' 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:09.463 
11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:09.463 BaseBdev2' 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:09.463 "name": "BaseBdev1", 00:29:09.463 "aliases": [ 00:29:09.463 "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56" 00:29:09.463 ], 00:29:09.463 "product_name": "Malloc disk", 00:29:09.463 "block_size": 512, 00:29:09.463 "num_blocks": 65536, 00:29:09.463 "uuid": "71dc5be3-a1d3-48cb-99ee-b48bdecf6b56", 00:29:09.463 "assigned_rate_limits": { 00:29:09.463 "rw_ios_per_sec": 0, 00:29:09.463 "rw_mbytes_per_sec": 0, 00:29:09.463 "r_mbytes_per_sec": 0, 00:29:09.463 "w_mbytes_per_sec": 0 00:29:09.463 }, 00:29:09.463 "claimed": true, 00:29:09.463 "claim_type": "exclusive_write", 00:29:09.463 "zoned": false, 00:29:09.463 "supported_io_types": { 00:29:09.463 "read": true, 00:29:09.463 "write": true, 00:29:09.463 "unmap": true, 00:29:09.463 "write_zeroes": true, 00:29:09.463 "flush": true, 00:29:09.463 "reset": true, 00:29:09.463 "compare": false, 00:29:09.463 "compare_and_write": false, 00:29:09.463 "abort": true, 00:29:09.463 "nvme_admin": false, 00:29:09.463 "nvme_io": false 00:29:09.463 }, 00:29:09.463 "memory_domains": [ 00:29:09.463 { 00:29:09.463 "dma_device_id": "system", 00:29:09.463 "dma_device_type": 1 00:29:09.463 }, 00:29:09.463 { 00:29:09.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.463 "dma_device_type": 2 00:29:09.463 } 00:29:09.463 ], 00:29:09.463 "driver_specific": {} 00:29:09.463 }' 00:29:09.463 11:39:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:09.463 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:09.722 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:09.981 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:09.981 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:09.981 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:09.981 "name": "BaseBdev2", 00:29:09.981 "aliases": [ 00:29:09.981 "8dede834-df5b-47a9-a3a6-eb4ca87d7627" 00:29:09.981 ], 00:29:09.981 "product_name": "Malloc disk", 00:29:09.981 "block_size": 512, 00:29:09.981 
"num_blocks": 65536, 00:29:09.981 "uuid": "8dede834-df5b-47a9-a3a6-eb4ca87d7627", 00:29:09.981 "assigned_rate_limits": { 00:29:09.981 "rw_ios_per_sec": 0, 00:29:09.981 "rw_mbytes_per_sec": 0, 00:29:09.981 "r_mbytes_per_sec": 0, 00:29:09.981 "w_mbytes_per_sec": 0 00:29:09.981 }, 00:29:09.981 "claimed": true, 00:29:09.981 "claim_type": "exclusive_write", 00:29:09.981 "zoned": false, 00:29:09.981 "supported_io_types": { 00:29:09.981 "read": true, 00:29:09.981 "write": true, 00:29:09.981 "unmap": true, 00:29:09.981 "write_zeroes": true, 00:29:09.981 "flush": true, 00:29:09.981 "reset": true, 00:29:09.981 "compare": false, 00:29:09.981 "compare_and_write": false, 00:29:09.981 "abort": true, 00:29:09.981 "nvme_admin": false, 00:29:09.981 "nvme_io": false 00:29:09.981 }, 00:29:09.981 "memory_domains": [ 00:29:09.981 { 00:29:09.981 "dma_device_id": "system", 00:29:09.981 "dma_device_type": 1 00:29:09.981 }, 00:29:09.981 { 00:29:09.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.981 "dma_device_type": 2 00:29:09.981 } 00:29:09.981 ], 00:29:09.981 "driver_specific": {} 00:29:09.981 }' 00:29:09.981 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:09.981 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:10.240 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:10.240 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:10.240 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:10.240 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:10.240 11:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:10.240 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:10.240 11:39:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:10.240 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:10.240 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:10.240 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:10.240 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:10.500 [2024-06-10 11:39:54.262341] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:10.500 
11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.500 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:10.779 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.779 "name": "Existed_Raid", 00:29:10.779 "uuid": "0ce3be51-7f8d-4f2e-bd49-31aaa0341f70", 00:29:10.779 "strip_size_kb": 0, 00:29:10.779 "state": "online", 00:29:10.779 "raid_level": "raid1", 00:29:10.779 "superblock": true, 00:29:10.779 "num_base_bdevs": 2, 00:29:10.779 "num_base_bdevs_discovered": 1, 00:29:10.779 "num_base_bdevs_operational": 1, 00:29:10.779 "base_bdevs_list": [ 00:29:10.779 { 00:29:10.779 "name": null, 00:29:10.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:10.779 "is_configured": false, 00:29:10.779 "data_offset": 2048, 00:29:10.779 "data_size": 63488 00:29:10.779 }, 00:29:10.779 { 00:29:10.779 "name": "BaseBdev2", 00:29:10.780 "uuid": "8dede834-df5b-47a9-a3a6-eb4ca87d7627", 00:29:10.780 "is_configured": true, 00:29:10.780 "data_offset": 2048, 00:29:10.780 "data_size": 63488 00:29:10.780 } 00:29:10.780 ] 00:29:10.780 }' 00:29:10.780 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.780 11:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:11.038 11:39:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:11.039 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:11.039 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.039 11:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:11.298 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:11.298 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:11.298 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:11.298 [2024-06-10 11:39:55.218411] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:11.298 [2024-06-10 11:39:55.218472] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:11.298 [2024-06-10 11:39:55.229057] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:11.298 [2024-06-10 11:39:55.229084] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:11.298 [2024-06-10 11:39:55.229092] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa60bf0 name Existed_Raid, state offline 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 142594 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 142594 ']' 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 142594 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 142594 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 142594' 00:29:11.557 killing process with pid 142594 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 142594 00:29:11.557 [2024-06-10 11:39:55.460999] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:11.557 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 142594 00:29:11.557 [2024-06-10 11:39:55.461862] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:11.817 11:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:29:11.817 00:29:11.817 real 0m8.109s 00:29:11.817 user 0m14.230s 00:29:11.817 sys 0m1.600s 00:29:11.817 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:11.817 11:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:11.817 ************************************ 00:29:11.817 END TEST raid_state_function_test_sb 00:29:11.817 ************************************ 00:29:11.817 11:39:55 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:29:11.817 11:39:55 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:29:11.817 11:39:55 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:11.817 11:39:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:11.817 ************************************ 00:29:11.817 START TEST raid_superblock_test 00:29:11.817 ************************************ 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:11.817 
11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=143885 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 143885 /var/tmp/spdk-raid.sock 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 143885 ']' 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:11.817 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:11.818 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:11.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:29:11.818 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:11.818 11:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:12.077 [2024-06-10 11:39:55.798102] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:29:12.077 [2024-06-10 11:39:55.798153] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid143885 ] 00:29:12.077 [2024-06-10 11:39:55.885113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:12.077 [2024-06-10 11:39:55.973049] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:12.336 [2024-06-10 11:39:56.035061] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:12.336 [2024-06-10 11:39:56.035088] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:12.990 11:39:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:12.990 11:39:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:29:12.990 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:12.990 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:12.990 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:12.990 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:12.991 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:12.991 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:12.991 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:29:12.991 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:12.991 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:29:12.991 malloc1 00:29:12.991 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:12.991 [2024-06-10 11:39:56.930338] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:12.991 [2024-06-10 11:39:56.930380] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:12.991 [2024-06-10 11:39:56.930396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f66100 00:29:12.991 [2024-06-10 11:39:56.930405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:12.991 [2024-06-10 11:39:56.931704] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:12.991 [2024-06-10 11:39:56.931728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:12.991 pt1 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:13.250 11:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:29:13.250 malloc2 00:29:13.250 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:13.508 [2024-06-10 11:39:57.272230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:13.508 [2024-06-10 11:39:57.272269] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:13.508 [2024-06-10 11:39:57.272300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f67500 00:29:13.508 [2024-06-10 11:39:57.272309] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:13.508 [2024-06-10 11:39:57.273514] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:13.508 [2024-06-10 11:39:57.273536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:13.508 pt2 00:29:13.508 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:13.508 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:13.508 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:13.508 [2024-06-10 11:39:57.444686] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:13.508 [2024-06-10 11:39:57.445607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:13.508 [2024-06-10 11:39:57.445713] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f66c00 00:29:13.508 [2024-06-10 11:39:57.445723] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:13.508 [2024-06-10 11:39:57.445854] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f7cff0 00:29:13.509 [2024-06-10 11:39:57.445967] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f66c00 00:29:13.509 [2024-06-10 11:39:57.445974] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f66c00 00:29:13.509 [2024-06-10 11:39:57.446039] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.767 11:39:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.767 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:13.767 "name": "raid_bdev1", 00:29:13.767 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:13.767 "strip_size_kb": 0, 00:29:13.767 "state": "online", 00:29:13.767 "raid_level": "raid1", 00:29:13.767 "superblock": true, 00:29:13.767 "num_base_bdevs": 2, 00:29:13.767 "num_base_bdevs_discovered": 2, 00:29:13.767 "num_base_bdevs_operational": 2, 00:29:13.767 "base_bdevs_list": [ 00:29:13.767 { 00:29:13.767 "name": "pt1", 00:29:13.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:13.767 "is_configured": true, 00:29:13.767 "data_offset": 2048, 00:29:13.768 "data_size": 63488 00:29:13.768 }, 00:29:13.768 { 00:29:13.768 "name": "pt2", 00:29:13.768 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:13.768 "is_configured": true, 00:29:13.768 "data_offset": 2048, 00:29:13.768 "data_size": 63488 00:29:13.768 } 00:29:13.768 ] 00:29:13.768 }' 00:29:13.768 11:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:13.768 11:39:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:14.335 [2024-06-10 11:39:58.218820] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:14.335 "name": "raid_bdev1", 00:29:14.335 "aliases": [ 00:29:14.335 "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674" 00:29:14.335 ], 00:29:14.335 "product_name": "Raid Volume", 00:29:14.335 "block_size": 512, 00:29:14.335 "num_blocks": 63488, 00:29:14.335 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:14.335 "assigned_rate_limits": { 00:29:14.335 "rw_ios_per_sec": 0, 00:29:14.335 "rw_mbytes_per_sec": 0, 00:29:14.335 "r_mbytes_per_sec": 0, 00:29:14.335 "w_mbytes_per_sec": 0 00:29:14.335 }, 00:29:14.335 "claimed": false, 00:29:14.335 "zoned": false, 00:29:14.335 "supported_io_types": { 00:29:14.335 "read": true, 00:29:14.335 "write": true, 00:29:14.335 "unmap": false, 00:29:14.335 "write_zeroes": true, 00:29:14.335 "flush": false, 00:29:14.335 "reset": true, 00:29:14.335 "compare": false, 00:29:14.335 "compare_and_write": false, 00:29:14.335 "abort": false, 00:29:14.335 "nvme_admin": false, 00:29:14.335 "nvme_io": false 00:29:14.335 }, 00:29:14.335 "memory_domains": [ 00:29:14.335 { 00:29:14.335 "dma_device_id": "system", 00:29:14.335 "dma_device_type": 1 00:29:14.335 }, 00:29:14.335 { 00:29:14.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:14.335 "dma_device_type": 2 00:29:14.335 }, 00:29:14.335 { 00:29:14.335 "dma_device_id": "system", 00:29:14.335 
"dma_device_type": 1 00:29:14.335 }, 00:29:14.335 { 00:29:14.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:14.335 "dma_device_type": 2 00:29:14.335 } 00:29:14.335 ], 00:29:14.335 "driver_specific": { 00:29:14.335 "raid": { 00:29:14.335 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:14.335 "strip_size_kb": 0, 00:29:14.335 "state": "online", 00:29:14.335 "raid_level": "raid1", 00:29:14.335 "superblock": true, 00:29:14.335 "num_base_bdevs": 2, 00:29:14.335 "num_base_bdevs_discovered": 2, 00:29:14.335 "num_base_bdevs_operational": 2, 00:29:14.335 "base_bdevs_list": [ 00:29:14.335 { 00:29:14.335 "name": "pt1", 00:29:14.335 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:14.335 "is_configured": true, 00:29:14.335 "data_offset": 2048, 00:29:14.335 "data_size": 63488 00:29:14.335 }, 00:29:14.335 { 00:29:14.335 "name": "pt2", 00:29:14.335 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:14.335 "is_configured": true, 00:29:14.335 "data_offset": 2048, 00:29:14.335 "data_size": 63488 00:29:14.335 } 00:29:14.335 ] 00:29:14.335 } 00:29:14.335 } 00:29:14.335 }' 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:14.335 pt2' 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:14.335 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:14.593 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:14.593 "name": "pt1", 00:29:14.593 "aliases": [ 00:29:14.593 "00000000-0000-0000-0000-000000000001" 00:29:14.593 ], 
00:29:14.593 "product_name": "passthru", 00:29:14.593 "block_size": 512, 00:29:14.593 "num_blocks": 65536, 00:29:14.593 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:14.593 "assigned_rate_limits": { 00:29:14.593 "rw_ios_per_sec": 0, 00:29:14.593 "rw_mbytes_per_sec": 0, 00:29:14.593 "r_mbytes_per_sec": 0, 00:29:14.593 "w_mbytes_per_sec": 0 00:29:14.593 }, 00:29:14.593 "claimed": true, 00:29:14.593 "claim_type": "exclusive_write", 00:29:14.593 "zoned": false, 00:29:14.593 "supported_io_types": { 00:29:14.593 "read": true, 00:29:14.593 "write": true, 00:29:14.593 "unmap": true, 00:29:14.593 "write_zeroes": true, 00:29:14.593 "flush": true, 00:29:14.593 "reset": true, 00:29:14.593 "compare": false, 00:29:14.593 "compare_and_write": false, 00:29:14.593 "abort": true, 00:29:14.593 "nvme_admin": false, 00:29:14.593 "nvme_io": false 00:29:14.593 }, 00:29:14.593 "memory_domains": [ 00:29:14.593 { 00:29:14.593 "dma_device_id": "system", 00:29:14.593 "dma_device_type": 1 00:29:14.593 }, 00:29:14.593 { 00:29:14.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:14.594 "dma_device_type": 2 00:29:14.594 } 00:29:14.594 ], 00:29:14.594 "driver_specific": { 00:29:14.594 "passthru": { 00:29:14.594 "name": "pt1", 00:29:14.594 "base_bdev_name": "malloc1" 00:29:14.594 } 00:29:14.594 } 00:29:14.594 }' 00:29:14.594 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:14.594 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:14.594 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:14.594 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:14.594 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:14.852 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:15.112 "name": "pt2", 00:29:15.112 "aliases": [ 00:29:15.112 "00000000-0000-0000-0000-000000000002" 00:29:15.112 ], 00:29:15.112 "product_name": "passthru", 00:29:15.112 "block_size": 512, 00:29:15.112 "num_blocks": 65536, 00:29:15.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:15.112 "assigned_rate_limits": { 00:29:15.112 "rw_ios_per_sec": 0, 00:29:15.112 "rw_mbytes_per_sec": 0, 00:29:15.112 "r_mbytes_per_sec": 0, 00:29:15.112 "w_mbytes_per_sec": 0 00:29:15.112 }, 00:29:15.112 "claimed": true, 00:29:15.112 "claim_type": "exclusive_write", 00:29:15.112 "zoned": false, 00:29:15.112 "supported_io_types": { 00:29:15.112 "read": true, 00:29:15.112 "write": true, 00:29:15.112 "unmap": true, 00:29:15.112 "write_zeroes": true, 00:29:15.112 "flush": true, 00:29:15.112 "reset": true, 00:29:15.112 "compare": false, 00:29:15.112 "compare_and_write": false, 00:29:15.112 "abort": true, 00:29:15.112 "nvme_admin": false, 00:29:15.112 "nvme_io": false 00:29:15.112 }, 00:29:15.112 
"memory_domains": [ 00:29:15.112 { 00:29:15.112 "dma_device_id": "system", 00:29:15.112 "dma_device_type": 1 00:29:15.112 }, 00:29:15.112 { 00:29:15.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:15.112 "dma_device_type": 2 00:29:15.112 } 00:29:15.112 ], 00:29:15.112 "driver_specific": { 00:29:15.112 "passthru": { 00:29:15.112 "name": "pt2", 00:29:15.112 "base_bdev_name": "malloc2" 00:29:15.112 } 00:29:15.112 } 00:29:15.112 }' 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:15.112 11:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:15.112 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:15.371 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:15.371 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:15.371 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:15.371 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:15.371 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:15.371 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:15.371 [2024-06-10 11:39:59.301634] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:15.630 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bf0ed8e7-fffd-4a23-9ce4-85ed9af36674 00:29:15.630 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z bf0ed8e7-fffd-4a23-9ce4-85ed9af36674 ']' 00:29:15.630 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:15.630 [2024-06-10 11:39:59.481943] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:15.630 [2024-06-10 11:39:59.481961] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:15.630 [2024-06-10 11:39:59.481997] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:15.630 [2024-06-10 11:39:59.482033] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:15.630 [2024-06-10 11:39:59.482041] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f66c00 name raid_bdev1, state offline 00:29:15.630 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.630 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:15.889 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:15.889 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:15.889 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:15.889 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:29:16.148 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:16.148 11:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:16.148 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:16.148 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:16.407 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:16.666 [2024-06-10 11:40:00.372220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:16.666 [2024-06-10 11:40:00.373248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:16.666 [2024-06-10 11:40:00.373292] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:16.666 [2024-06-10 11:40:00.373323] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:16.666 [2024-06-10 11:40:00.373336] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:16.666 [2024-06-10 11:40:00.373343] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21109e0 name raid_bdev1, state configuring 00:29:16.666 request: 00:29:16.666 { 00:29:16.666 "name": "raid_bdev1", 00:29:16.666 "raid_level": "raid1", 00:29:16.666 "base_bdevs": [ 00:29:16.666 "malloc1", 00:29:16.666 "malloc2" 00:29:16.666 ], 00:29:16.666 "superblock": false, 00:29:16.666 "method": "bdev_raid_create", 00:29:16.666 "req_id": 1 00:29:16.666 } 00:29:16.666 Got JSON-RPC error response 
00:29:16.666 response: 00:29:16.666 { 00:29:16.666 "code": -17, 00:29:16.666 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:16.666 } 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:16.666 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:16.925 [2024-06-10 11:40:00.737130] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:16.925 [2024-06-10 11:40:00.737166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:16.925 [2024-06-10 11:40:00.737180] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2111e70 00:29:16.925 [2024-06-10 11:40:00.737188] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:16.925 [2024-06-10 11:40:00.738414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:16.925 [2024-06-10 11:40:00.738436] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: pt1 00:29:16.925 [2024-06-10 11:40:00.738486] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:16.925 [2024-06-10 11:40:00.738506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:16.925 pt1 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.925 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.184 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.184 "name": "raid_bdev1", 00:29:17.184 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:17.184 "strip_size_kb": 0, 00:29:17.184 "state": "configuring", 
00:29:17.184 "raid_level": "raid1", 00:29:17.184 "superblock": true, 00:29:17.184 "num_base_bdevs": 2, 00:29:17.184 "num_base_bdevs_discovered": 1, 00:29:17.184 "num_base_bdevs_operational": 2, 00:29:17.184 "base_bdevs_list": [ 00:29:17.184 { 00:29:17.184 "name": "pt1", 00:29:17.184 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:17.184 "is_configured": true, 00:29:17.184 "data_offset": 2048, 00:29:17.184 "data_size": 63488 00:29:17.184 }, 00:29:17.184 { 00:29:17.184 "name": null, 00:29:17.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:17.184 "is_configured": false, 00:29:17.184 "data_offset": 2048, 00:29:17.184 "data_size": 63488 00:29:17.184 } 00:29:17.184 ] 00:29:17.184 }' 00:29:17.184 11:40:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.184 11:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:17.443 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:17.443 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:17.443 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:17.443 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:17.702 [2024-06-10 11:40:01.507132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:17.702 [2024-06-10 11:40:01.507172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.702 [2024-06-10 11:40:01.507185] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210f340 00:29:17.703 [2024-06-10 11:40:01.507193] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.703 [2024-06-10 11:40:01.507439] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.703 [2024-06-10 11:40:01.507451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:17.703 [2024-06-10 11:40:01.507498] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:17.703 [2024-06-10 11:40:01.507513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:17.703 [2024-06-10 11:40:01.507585] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f65490 00:29:17.703 [2024-06-10 11:40:01.507592] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:17.703 [2024-06-10 11:40:01.507698] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21165d0 00:29:17.703 [2024-06-10 11:40:01.507782] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f65490 00:29:17.703 [2024-06-10 11:40:01.507789] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f65490 00:29:17.703 [2024-06-10 11:40:01.507852] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:17.703 pt2 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:17.703 11:40:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.703 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.962 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.962 "name": "raid_bdev1", 00:29:17.962 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:17.962 "strip_size_kb": 0, 00:29:17.962 "state": "online", 00:29:17.962 "raid_level": "raid1", 00:29:17.962 "superblock": true, 00:29:17.962 "num_base_bdevs": 2, 00:29:17.962 "num_base_bdevs_discovered": 2, 00:29:17.962 "num_base_bdevs_operational": 2, 00:29:17.962 "base_bdevs_list": [ 00:29:17.962 { 00:29:17.962 "name": "pt1", 00:29:17.962 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:17.962 "is_configured": true, 00:29:17.962 "data_offset": 2048, 00:29:17.962 "data_size": 63488 00:29:17.962 }, 00:29:17.962 { 00:29:17.962 "name": "pt2", 00:29:17.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:17.962 "is_configured": true, 00:29:17.962 "data_offset": 2048, 00:29:17.962 "data_size": 63488 00:29:17.962 } 00:29:17.962 ] 00:29:17.962 }' 00:29:17.962 11:40:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.962 11:40:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:18.528 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:18.529 [2024-06-10 11:40:02.337425] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:18.529 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:18.529 "name": "raid_bdev1", 00:29:18.529 "aliases": [ 00:29:18.529 "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674" 00:29:18.529 ], 00:29:18.529 "product_name": "Raid Volume", 00:29:18.529 "block_size": 512, 00:29:18.529 "num_blocks": 63488, 00:29:18.529 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:18.529 "assigned_rate_limits": { 00:29:18.529 "rw_ios_per_sec": 0, 00:29:18.529 "rw_mbytes_per_sec": 0, 00:29:18.529 "r_mbytes_per_sec": 0, 00:29:18.529 "w_mbytes_per_sec": 0 00:29:18.529 }, 00:29:18.529 "claimed": false, 00:29:18.529 "zoned": false, 00:29:18.529 "supported_io_types": { 00:29:18.529 "read": true, 00:29:18.529 "write": true, 00:29:18.529 "unmap": false, 00:29:18.529 "write_zeroes": true, 00:29:18.529 "flush": false, 00:29:18.529 "reset": true, 00:29:18.529 "compare": false, 00:29:18.529 "compare_and_write": 
false, 00:29:18.529 "abort": false, 00:29:18.529 "nvme_admin": false, 00:29:18.529 "nvme_io": false 00:29:18.529 }, 00:29:18.529 "memory_domains": [ 00:29:18.529 { 00:29:18.529 "dma_device_id": "system", 00:29:18.529 "dma_device_type": 1 00:29:18.529 }, 00:29:18.529 { 00:29:18.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.529 "dma_device_type": 2 00:29:18.529 }, 00:29:18.529 { 00:29:18.529 "dma_device_id": "system", 00:29:18.529 "dma_device_type": 1 00:29:18.529 }, 00:29:18.529 { 00:29:18.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.529 "dma_device_type": 2 00:29:18.529 } 00:29:18.529 ], 00:29:18.529 "driver_specific": { 00:29:18.529 "raid": { 00:29:18.529 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:18.529 "strip_size_kb": 0, 00:29:18.529 "state": "online", 00:29:18.529 "raid_level": "raid1", 00:29:18.529 "superblock": true, 00:29:18.529 "num_base_bdevs": 2, 00:29:18.529 "num_base_bdevs_discovered": 2, 00:29:18.529 "num_base_bdevs_operational": 2, 00:29:18.529 "base_bdevs_list": [ 00:29:18.529 { 00:29:18.529 "name": "pt1", 00:29:18.529 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:18.529 "is_configured": true, 00:29:18.529 "data_offset": 2048, 00:29:18.529 "data_size": 63488 00:29:18.529 }, 00:29:18.529 { 00:29:18.529 "name": "pt2", 00:29:18.529 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:18.529 "is_configured": true, 00:29:18.529 "data_offset": 2048, 00:29:18.529 "data_size": 63488 00:29:18.529 } 00:29:18.529 ] 00:29:18.529 } 00:29:18.529 } 00:29:18.529 }' 00:29:18.529 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:18.529 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:18.529 pt2' 00:29:18.529 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:18.529 11:40:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:18.529 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:18.787 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:18.787 "name": "pt1", 00:29:18.787 "aliases": [ 00:29:18.787 "00000000-0000-0000-0000-000000000001" 00:29:18.787 ], 00:29:18.787 "product_name": "passthru", 00:29:18.787 "block_size": 512, 00:29:18.787 "num_blocks": 65536, 00:29:18.787 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:18.787 "assigned_rate_limits": { 00:29:18.787 "rw_ios_per_sec": 0, 00:29:18.787 "rw_mbytes_per_sec": 0, 00:29:18.787 "r_mbytes_per_sec": 0, 00:29:18.787 "w_mbytes_per_sec": 0 00:29:18.787 }, 00:29:18.787 "claimed": true, 00:29:18.787 "claim_type": "exclusive_write", 00:29:18.787 "zoned": false, 00:29:18.787 "supported_io_types": { 00:29:18.787 "read": true, 00:29:18.788 "write": true, 00:29:18.788 "unmap": true, 00:29:18.788 "write_zeroes": true, 00:29:18.788 "flush": true, 00:29:18.788 "reset": true, 00:29:18.788 "compare": false, 00:29:18.788 "compare_and_write": false, 00:29:18.788 "abort": true, 00:29:18.788 "nvme_admin": false, 00:29:18.788 "nvme_io": false 00:29:18.788 }, 00:29:18.788 "memory_domains": [ 00:29:18.788 { 00:29:18.788 "dma_device_id": "system", 00:29:18.788 "dma_device_type": 1 00:29:18.788 }, 00:29:18.788 { 00:29:18.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.788 "dma_device_type": 2 00:29:18.788 } 00:29:18.788 ], 00:29:18.788 "driver_specific": { 00:29:18.788 "passthru": { 00:29:18.788 "name": "pt1", 00:29:18.788 "base_bdev_name": "malloc1" 00:29:18.788 } 00:29:18.788 } 00:29:18.788 }' 00:29:18.788 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:18.788 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:18.788 11:40:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:18.788 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:18.788 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:18.788 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:18.788 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:19.046 11:40:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:19.305 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:19.305 "name": "pt2", 00:29:19.305 "aliases": [ 00:29:19.305 "00000000-0000-0000-0000-000000000002" 00:29:19.305 ], 00:29:19.305 "product_name": "passthru", 00:29:19.305 "block_size": 512, 00:29:19.305 "num_blocks": 65536, 00:29:19.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:19.305 "assigned_rate_limits": { 00:29:19.305 "rw_ios_per_sec": 0, 00:29:19.305 "rw_mbytes_per_sec": 0, 00:29:19.305 "r_mbytes_per_sec": 0, 00:29:19.305 "w_mbytes_per_sec": 0 00:29:19.305 }, 00:29:19.305 "claimed": true, 00:29:19.305 
"claim_type": "exclusive_write", 00:29:19.305 "zoned": false, 00:29:19.305 "supported_io_types": { 00:29:19.305 "read": true, 00:29:19.305 "write": true, 00:29:19.305 "unmap": true, 00:29:19.305 "write_zeroes": true, 00:29:19.305 "flush": true, 00:29:19.305 "reset": true, 00:29:19.305 "compare": false, 00:29:19.305 "compare_and_write": false, 00:29:19.305 "abort": true, 00:29:19.305 "nvme_admin": false, 00:29:19.306 "nvme_io": false 00:29:19.306 }, 00:29:19.306 "memory_domains": [ 00:29:19.306 { 00:29:19.306 "dma_device_id": "system", 00:29:19.306 "dma_device_type": 1 00:29:19.306 }, 00:29:19.306 { 00:29:19.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:19.306 "dma_device_type": 2 00:29:19.306 } 00:29:19.306 ], 00:29:19.306 "driver_specific": { 00:29:19.306 "passthru": { 00:29:19.306 "name": "pt2", 00:29:19.306 "base_bdev_name": "malloc2" 00:29:19.306 } 00:29:19.306 } 00:29:19.306 }' 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:19.306 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.565 
11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:19.565 [2024-06-10 11:40:03.476342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' bf0ed8e7-fffd-4a23-9ce4-85ed9af36674 '!=' bf0ed8e7-fffd-4a23-9ce4-85ed9af36674 ']' 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:29:19.565 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:19.824 [2024-06-10 11:40:03.656694] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:19.824 11:40:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.824 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.083 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:20.083 "name": "raid_bdev1", 00:29:20.083 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:20.083 "strip_size_kb": 0, 00:29:20.083 "state": "online", 00:29:20.083 "raid_level": "raid1", 00:29:20.083 "superblock": true, 00:29:20.083 "num_base_bdevs": 2, 00:29:20.083 "num_base_bdevs_discovered": 1, 00:29:20.083 "num_base_bdevs_operational": 1, 00:29:20.083 "base_bdevs_list": [ 00:29:20.083 { 00:29:20.083 "name": null, 00:29:20.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.083 "is_configured": false, 00:29:20.083 "data_offset": 2048, 00:29:20.083 "data_size": 63488 00:29:20.083 }, 00:29:20.083 { 00:29:20.083 "name": "pt2", 00:29:20.083 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:20.083 "is_configured": true, 00:29:20.083 "data_offset": 2048, 00:29:20.083 "data_size": 63488 00:29:20.083 } 00:29:20.083 ] 00:29:20.083 }' 00:29:20.083 11:40:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:20.083 11:40:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:20.651 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:20.651 [2024-06-10 11:40:04.490843] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:20.651 [2024-06-10 11:40:04.490865] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:20.651 [2024-06-10 11:40:04.490910] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:20.651 [2024-06-10 11:40:04.490940] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:20.651 [2024-06-10 11:40:04.490948] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f65490 name raid_bdev1, state offline 00:29:20.651 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.651 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:20.910 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:20.910 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:20.910 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:20.910 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:20.910 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:21.169 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:21.169 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:21.169 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:21.169 11:40:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:21.169 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:29:21.169 11:40:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:21.169 [2024-06-10 11:40:05.024202] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:21.169 [2024-06-10 11:40:05.024236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:21.169 [2024-06-10 11:40:05.024247] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210f570 00:29:21.169 [2024-06-10 11:40:05.024271] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:21.169 [2024-06-10 11:40:05.025428] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:21.169 [2024-06-10 11:40:05.025449] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:21.169 [2024-06-10 11:40:05.025501] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:21.169 [2024-06-10 11:40:05.025520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:21.170 [2024-06-10 11:40:05.025579] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2113340 00:29:21.170 [2024-06-10 11:40:05.025587] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:21.170 [2024-06-10 11:40:05.025702] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2111050 00:29:21.170 [2024-06-10 11:40:05.025784] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2113340 00:29:21.170 [2024-06-10 11:40:05.025790] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x2113340 00:29:21.170 [2024-06-10 11:40:05.025853] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:21.170 pt2 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.170 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.429 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:21.429 "name": "raid_bdev1", 00:29:21.429 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:21.429 "strip_size_kb": 0, 00:29:21.429 "state": "online", 00:29:21.429 "raid_level": "raid1", 00:29:21.429 "superblock": true, 00:29:21.429 "num_base_bdevs": 2, 00:29:21.429 
"num_base_bdevs_discovered": 1, 00:29:21.429 "num_base_bdevs_operational": 1, 00:29:21.429 "base_bdevs_list": [ 00:29:21.429 { 00:29:21.429 "name": null, 00:29:21.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:21.429 "is_configured": false, 00:29:21.429 "data_offset": 2048, 00:29:21.429 "data_size": 63488 00:29:21.429 }, 00:29:21.429 { 00:29:21.429 "name": "pt2", 00:29:21.429 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:21.429 "is_configured": true, 00:29:21.429 "data_offset": 2048, 00:29:21.429 "data_size": 63488 00:29:21.429 } 00:29:21.429 ] 00:29:21.429 }' 00:29:21.429 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:21.429 11:40:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:21.997 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:21.997 [2024-06-10 11:40:05.870393] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:21.997 [2024-06-10 11:40:05.870413] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:21.997 [2024-06-10 11:40:05.870454] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:21.997 [2024-06-10 11:40:05.870488] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:21.997 [2024-06-10 11:40:05.870497] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2113340 name raid_bdev1, state offline 00:29:21.997 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.997 11:40:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:22.256 11:40:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:22.256 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:22.256 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:22.256 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:22.517 [2024-06-10 11:40:06.227445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:22.517 [2024-06-10 11:40:06.227488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:22.517 [2024-06-10 11:40:06.227500] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21130d0 00:29:22.517 [2024-06-10 11:40:06.227508] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:22.517 [2024-06-10 11:40:06.228674] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:22.517 [2024-06-10 11:40:06.228695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:22.517 [2024-06-10 11:40:06.228743] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:22.517 [2024-06-10 11:40:06.228762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:22.517 [2024-06-10 11:40:06.228829] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:22.518 [2024-06-10 11:40:06.228838] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:22.518 [2024-06-10 11:40:06.228846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2110330 name raid_bdev1, state configuring 00:29:22.518 [2024-06-10 11:40:06.228862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
pt2 is claimed 00:29:22.518 [2024-06-10 11:40:06.228912] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2116360 00:29:22.518 [2024-06-10 11:40:06.228919] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:22.518 [2024-06-10 11:40:06.229032] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2111050 00:29:22.518 [2024-06-10 11:40:06.229114] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2116360 00:29:22.518 [2024-06-10 11:40:06.229121] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2116360 00:29:22.518 [2024-06-10 11:40:06.229186] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:22.518 pt1 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:22.518 "name": "raid_bdev1", 00:29:22.518 "uuid": "bf0ed8e7-fffd-4a23-9ce4-85ed9af36674", 00:29:22.518 "strip_size_kb": 0, 00:29:22.518 "state": "online", 00:29:22.518 "raid_level": "raid1", 00:29:22.518 "superblock": true, 00:29:22.518 "num_base_bdevs": 2, 00:29:22.518 "num_base_bdevs_discovered": 1, 00:29:22.518 "num_base_bdevs_operational": 1, 00:29:22.518 "base_bdevs_list": [ 00:29:22.518 { 00:29:22.518 "name": null, 00:29:22.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:22.518 "is_configured": false, 00:29:22.518 "data_offset": 2048, 00:29:22.518 "data_size": 63488 00:29:22.518 }, 00:29:22.518 { 00:29:22.518 "name": "pt2", 00:29:22.518 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:22.518 "is_configured": true, 00:29:22.518 "data_offset": 2048, 00:29:22.518 "data_size": 63488 00:29:22.518 } 00:29:22.518 ] 00:29:22.518 }' 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:22.518 11:40:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:23.086 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:23.086 11:40:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:23.346 [2024-06-10 11:40:07.254228] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' bf0ed8e7-fffd-4a23-9ce4-85ed9af36674 '!=' bf0ed8e7-fffd-4a23-9ce4-85ed9af36674 ']' 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 143885 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 143885 ']' 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 143885 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:23.346 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 143885 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 143885' 00:29:23.606 killing process with pid 143885 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 143885 00:29:23.606 [2024-06-10 11:40:07.304168] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:23.606 [2024-06-10 11:40:07.304210] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:23.606 [2024-06-10 11:40:07.304240] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:29:23.606 [2024-06-10 11:40:07.304248] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2116360 name raid_bdev1, state offline 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 143885 00:29:23.606 [2024-06-10 11:40:07.322481] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:29:23.606 00:29:23.606 real 0m11.778s 00:29:23.606 user 0m21.219s 00:29:23.606 sys 0m2.254s 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:23.606 11:40:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:29:23.606 ************************************ 00:29:23.606 END TEST raid_superblock_test 00:29:23.606 ************************************ 00:29:23.866 11:40:07 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:29:23.866 11:40:07 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:23.866 11:40:07 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:23.866 11:40:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:23.866 ************************************ 00:29:23.866 START TEST raid_read_error_test 00:29:23.866 ************************************ 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 read 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 
00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:29:23.866 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0w2ViLK0kw 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=145803 00:29:23.867 11:40:07 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 145803 /var/tmp/spdk-raid.sock 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 145803 ']' 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:23.867 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:23.867 11:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:23.867 [2024-06-10 11:40:07.670533] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:29:23.867 [2024-06-10 11:40:07.670586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145803 ] 00:29:23.867 [2024-06-10 11:40:07.757091] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:24.126 [2024-06-10 11:40:07.845166] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:24.126 [2024-06-10 11:40:07.903854] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:24.126 [2024-06-10 11:40:07.903895] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:24.694 11:40:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:24.694 11:40:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:29:24.694 11:40:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:29:24.694 11:40:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:29:24.694 BaseBdev1_malloc 00:29:24.694 11:40:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:29:24.953 true 00:29:24.953 11:40:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:29:25.212 [2024-06-10 11:40:08.956200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:29:25.212 [2024-06-10 11:40:08.956236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:25.212 
[2024-06-10 11:40:08.956249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1902b10 00:29:25.212 [2024-06-10 11:40:08.956273] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:25.212 [2024-06-10 11:40:08.957644] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:25.212 [2024-06-10 11:40:08.957667] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:25.212 BaseBdev1 00:29:25.212 11:40:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:29:25.212 11:40:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:29:25.212 BaseBdev2_malloc 00:29:25.212 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:29:25.471 true 00:29:25.471 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:29:25.731 [2024-06-10 11:40:09.474436] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:29:25.731 [2024-06-10 11:40:09.474472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:25.731 [2024-06-10 11:40:09.474486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1907280 00:29:25.731 [2024-06-10 11:40:09.474511] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:25.731 [2024-06-10 11:40:09.475686] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:25.731 [2024-06-10 11:40:09.475710] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:25.731 BaseBdev2 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:29:25.731 [2024-06-10 11:40:09.650921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:25.731 [2024-06-10 11:40:09.651919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:25.731 [2024-06-10 11:40:09.652062] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1906ca0 00:29:25.731 [2024-06-10 11:40:09.652071] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:25.731 [2024-06-10 11:40:09.652216] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1909d10 00:29:25.731 [2024-06-10 11:40:09.652326] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1906ca0 00:29:25.731 [2024-06-10 11:40:09.652332] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1906ca0 00:29:25.731 [2024-06-10 11:40:09.652414] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.731 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.990 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.990 "name": "raid_bdev1", 00:29:25.990 "uuid": "813077d2-83b4-403a-a180-a9ce0c2329a5", 00:29:25.990 "strip_size_kb": 0, 00:29:25.990 "state": "online", 00:29:25.990 "raid_level": "raid1", 00:29:25.990 "superblock": true, 00:29:25.990 "num_base_bdevs": 2, 00:29:25.990 "num_base_bdevs_discovered": 2, 00:29:25.990 "num_base_bdevs_operational": 2, 00:29:25.990 "base_bdevs_list": [ 00:29:25.990 { 00:29:25.990 "name": "BaseBdev1", 00:29:25.990 "uuid": "136c888a-2f80-5d88-8010-ee5439100a81", 00:29:25.990 "is_configured": true, 00:29:25.990 "data_offset": 2048, 00:29:25.990 "data_size": 63488 00:29:25.990 }, 00:29:25.990 { 00:29:25.990 "name": "BaseBdev2", 00:29:25.990 "uuid": "c55f2b34-c369-55f7-b546-9f314983b2d8", 00:29:25.990 "is_configured": true, 00:29:25.990 "data_offset": 2048, 00:29:25.990 "data_size": 63488 00:29:25.990 } 00:29:25.990 ] 00:29:25.990 }' 00:29:25.990 11:40:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.990 11:40:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:26.559 11:40:10 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:29:26.559 11:40:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:29:26.559 [2024-06-10 11:40:10.405066] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1904720 00:29:27.496 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:29:27.755 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:29:27.755 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.756 "name": "raid_bdev1", 00:29:27.756 "uuid": "813077d2-83b4-403a-a180-a9ce0c2329a5", 00:29:27.756 "strip_size_kb": 0, 00:29:27.756 "state": "online", 00:29:27.756 "raid_level": "raid1", 00:29:27.756 "superblock": true, 00:29:27.756 "num_base_bdevs": 2, 00:29:27.756 "num_base_bdevs_discovered": 2, 00:29:27.756 "num_base_bdevs_operational": 2, 00:29:27.756 "base_bdevs_list": [ 00:29:27.756 { 00:29:27.756 "name": "BaseBdev1", 00:29:27.756 "uuid": "136c888a-2f80-5d88-8010-ee5439100a81", 00:29:27.756 "is_configured": true, 00:29:27.756 "data_offset": 2048, 00:29:27.756 "data_size": 63488 00:29:27.756 }, 00:29:27.756 { 00:29:27.756 "name": "BaseBdev2", 00:29:27.756 "uuid": "c55f2b34-c369-55f7-b546-9f314983b2d8", 00:29:27.756 "is_configured": true, 00:29:27.756 "data_offset": 2048, 00:29:27.756 "data_size": 63488 00:29:27.756 } 00:29:27.756 ] 00:29:27.756 }' 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.756 11:40:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:28.336 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:28.594 [2024-06-10 11:40:12.317450] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:28.594 [2024-06-10 
11:40:12.317486] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:28.594 [2024-06-10 11:40:12.319517] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:28.594 [2024-06-10 11:40:12.319539] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:28.594 [2024-06-10 11:40:12.319587] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:28.594 [2024-06-10 11:40:12.319595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1906ca0 name raid_bdev1, state offline 00:29:28.594 0 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 145803 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 145803 ']' 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 145803 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 145803 00:29:28.594 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:28.595 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:28.595 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 145803' 00:29:28.595 killing process with pid 145803 00:29:28.595 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 145803 00:29:28.595 [2024-06-10 11:40:12.385091] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:28.595 11:40:12 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@973 -- # wait 145803 00:29:28.595 [2024-06-10 11:40:12.395369] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.0w2ViLK0kw 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:29:28.854 00:29:28.854 real 0m5.004s 00:29:28.854 user 0m7.494s 00:29:28.854 sys 0m0.898s 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:28.854 11:40:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:28.854 ************************************ 00:29:28.854 END TEST raid_read_error_test 00:29:28.854 ************************************ 00:29:28.854 11:40:12 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:29:28.854 11:40:12 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:28.854 11:40:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:28.854 11:40:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:28.854 ************************************ 00:29:28.854 START TEST raid_write_error_test 00:29:28.854 ************************************ 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 write 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:29:28.854 11:40:12 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.8Aau4uZetj 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=146558 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 146558 /var/tmp/spdk-raid.sock 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 146558 ']' 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:28.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:28.854 11:40:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:28.854 [2024-06-10 11:40:12.757835] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:29:28.854 [2024-06-10 11:40:12.757893] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146558 ] 00:29:29.113 [2024-06-10 11:40:12.845213] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:29.113 [2024-06-10 11:40:12.937224] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:29.113 [2024-06-10 11:40:12.995740] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:29.113 [2024-06-10 11:40:12.995770] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:29.680 11:40:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:29.680 11:40:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:29:29.680 11:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:29:29.680 11:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:29:29.938 BaseBdev1_malloc 00:29:29.938 11:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:29:30.196 true 00:29:30.196 11:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:29:30.196 [2024-06-10 11:40:14.065103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:29:30.196 [2024-06-10 11:40:14.065138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:30.196 
[2024-06-10 11:40:14.065151] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175ab10 00:29:30.196 [2024-06-10 11:40:14.065175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:30.196 [2024-06-10 11:40:14.066509] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:30.196 [2024-06-10 11:40:14.066531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:30.196 BaseBdev1 00:29:30.196 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:29:30.196 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:29:30.455 BaseBdev2_malloc 00:29:30.455 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:29:30.713 true 00:29:30.713 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:29:30.713 [2024-06-10 11:40:14.578048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:29:30.713 [2024-06-10 11:40:14.578085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:30.713 [2024-06-10 11:40:14.578114] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175f280 00:29:30.713 [2024-06-10 11:40:14.578122] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:30.713 [2024-06-10 11:40:14.579286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:30.714 [2024-06-10 11:40:14.579309] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:30.714 BaseBdev2 00:29:30.714 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:29:30.972 [2024-06-10 11:40:14.750520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:30.972 [2024-06-10 11:40:14.751522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:30.972 [2024-06-10 11:40:14.751664] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x175eca0 00:29:30.972 [2024-06-10 11:40:14.751673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:30.972 [2024-06-10 11:40:14.751813] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1761d10 00:29:30.973 [2024-06-10 11:40:14.751930] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x175eca0 00:29:30.973 [2024-06-10 11:40:14.751937] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x175eca0 00:29:30.973 [2024-06-10 11:40:14.752012] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.973 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.231 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.231 "name": "raid_bdev1", 00:29:31.231 "uuid": "62db8f06-e4f4-42ef-937a-e03116fb4f92", 00:29:31.231 "strip_size_kb": 0, 00:29:31.231 "state": "online", 00:29:31.231 "raid_level": "raid1", 00:29:31.231 "superblock": true, 00:29:31.231 "num_base_bdevs": 2, 00:29:31.231 "num_base_bdevs_discovered": 2, 00:29:31.231 "num_base_bdevs_operational": 2, 00:29:31.231 "base_bdevs_list": [ 00:29:31.231 { 00:29:31.231 "name": "BaseBdev1", 00:29:31.231 "uuid": "1e261042-e056-56a9-a4af-5d525b5bd1a6", 00:29:31.231 "is_configured": true, 00:29:31.231 "data_offset": 2048, 00:29:31.231 "data_size": 63488 00:29:31.231 }, 00:29:31.231 { 00:29:31.231 "name": "BaseBdev2", 00:29:31.231 "uuid": "f17e75c4-d179-5657-84e6-92574109a57e", 00:29:31.231 "is_configured": true, 00:29:31.231 "data_offset": 2048, 00:29:31.231 "data_size": 63488 00:29:31.231 } 00:29:31.231 ] 00:29:31.231 }' 00:29:31.231 11:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:31.231 11:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:31.490 11:40:15 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:29:31.490 11:40:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:29:31.748 [2024-06-10 11:40:15.504669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x175c720 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:29:32.685 [2024-06-10 11:40:16.586016] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:29:32.685 [2024-06-10 11:40:16.586063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:32.685 [2024-06-10 11:40:16.586224] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x175c720 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:32.685 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:32.943 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:32.943 "name": "raid_bdev1", 00:29:32.943 "uuid": "62db8f06-e4f4-42ef-937a-e03116fb4f92", 00:29:32.943 "strip_size_kb": 0, 00:29:32.943 "state": "online", 00:29:32.943 "raid_level": "raid1", 00:29:32.943 "superblock": true, 00:29:32.943 "num_base_bdevs": 2, 00:29:32.943 "num_base_bdevs_discovered": 1, 00:29:32.943 "num_base_bdevs_operational": 1, 00:29:32.943 "base_bdevs_list": [ 00:29:32.943 { 00:29:32.943 "name": null, 00:29:32.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:32.943 "is_configured": false, 00:29:32.943 "data_offset": 2048, 00:29:32.943 "data_size": 63488 00:29:32.943 }, 00:29:32.943 { 00:29:32.943 "name": "BaseBdev2", 00:29:32.943 "uuid": "f17e75c4-d179-5657-84e6-92574109a57e", 00:29:32.943 "is_configured": true, 00:29:32.943 "data_offset": 2048, 00:29:32.943 "data_size": 63488 00:29:32.943 } 00:29:32.943 ] 00:29:32.943 }' 00:29:32.943 11:40:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:32.943 
11:40:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:33.511 [2024-06-10 11:40:17.418916] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:33.511 [2024-06-10 11:40:17.418952] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:33.511 [2024-06-10 11:40:17.420912] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:33.511 [2024-06-10 11:40:17.420928] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:33.511 [2024-06-10 11:40:17.420960] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:33.511 [2024-06-10 11:40:17.420967] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x175eca0 name raid_bdev1, state offline 00:29:33.511 0 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 146558 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 146558 ']' 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 146558 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:33.511 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 146558 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:33.770 11:40:17 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 146558' 00:29:33.770 killing process with pid 146558 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 146558 00:29:33.770 [2024-06-10 11:40:17.481030] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 146558 00:29:33.770 [2024-06-10 11:40:17.490733] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.8Aau4uZetj 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:29:33.770 00:29:33.770 real 0m4.999s 00:29:33.770 user 0m7.506s 00:29:33.770 sys 0m0.874s 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:33.770 11:40:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:29:33.770 ************************************ 00:29:33.770 END TEST raid_write_error_test 00:29:33.770 ************************************ 00:29:34.064 11:40:17 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:29:34.064 11:40:17 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 
concat raid1 00:29:34.064 11:40:17 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:29:34.064 11:40:17 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:34.064 11:40:17 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:34.064 11:40:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:34.064 ************************************ 00:29:34.064 START TEST raid_state_function_test 00:29:34.064 ************************************ 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 false 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:34.064 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:34.065 11:40:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=147235 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 147235' 00:29:34.065 Process raid pid: 147235 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 
00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 147235 /var/tmp/spdk-raid.sock 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 147235 ']' 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:34.065 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:34.065 11:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:34.065 [2024-06-10 11:40:17.836219] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:29:34.065 [2024-06-10 11:40:17.836276] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:34.065 [2024-06-10 11:40:17.928397] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:34.352 [2024-06-10 11:40:18.020946] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:34.352 [2024-06-10 11:40:18.090981] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:34.352 [2024-06-10 11:40:18.091011] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:34.922 11:40:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:34.922 11:40:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:29:34.922 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:34.922 [2024-06-10 11:40:18.786620] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:34.922 [2024-06-10 11:40:18.786658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:34.922 [2024-06-10 11:40:18.786665] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:34.922 [2024-06-10 11:40:18.786673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:34.922 [2024-06-10 11:40:18.786678] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:34.922 [2024-06-10 11:40:18.786689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:34.922 11:40:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.923 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:35.181 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:35.181 "name": "Existed_Raid", 00:29:35.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.181 "strip_size_kb": 64, 00:29:35.181 "state": "configuring", 00:29:35.181 "raid_level": "raid0", 00:29:35.181 "superblock": false, 00:29:35.181 "num_base_bdevs": 3, 00:29:35.181 "num_base_bdevs_discovered": 0, 00:29:35.181 "num_base_bdevs_operational": 3, 00:29:35.181 "base_bdevs_list": [ 00:29:35.181 { 
00:29:35.181 "name": "BaseBdev1", 00:29:35.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.181 "is_configured": false, 00:29:35.181 "data_offset": 0, 00:29:35.181 "data_size": 0 00:29:35.181 }, 00:29:35.181 { 00:29:35.181 "name": "BaseBdev2", 00:29:35.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.181 "is_configured": false, 00:29:35.181 "data_offset": 0, 00:29:35.181 "data_size": 0 00:29:35.181 }, 00:29:35.181 { 00:29:35.181 "name": "BaseBdev3", 00:29:35.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.181 "is_configured": false, 00:29:35.181 "data_offset": 0, 00:29:35.181 "data_size": 0 00:29:35.181 } 00:29:35.181 ] 00:29:35.182 }' 00:29:35.182 11:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:35.182 11:40:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:35.749 11:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:35.749 [2024-06-10 11:40:19.576584] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:35.749 [2024-06-10 11:40:19.576608] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c3530 name Existed_Raid, state configuring 00:29:35.749 11:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:36.007 [2024-06-10 11:40:19.749043] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:36.007 [2024-06-10 11:40:19.749068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:36.008 [2024-06-10 11:40:19.749074] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:29:36.008 [2024-06-10 11:40:19.749082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:36.008 [2024-06-10 11:40:19.749087] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:36.008 [2024-06-10 11:40:19.749094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:36.008 [2024-06-10 11:40:19.918154] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:36.008 BaseBdev1 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:36.008 11:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:36.266 11:40:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:36.525 [ 00:29:36.525 { 00:29:36.525 "name": "BaseBdev1", 00:29:36.525 "aliases": [ 00:29:36.525 
"6834e13f-07cb-467a-8c3d-9e496eb694a6" 00:29:36.525 ], 00:29:36.525 "product_name": "Malloc disk", 00:29:36.525 "block_size": 512, 00:29:36.525 "num_blocks": 65536, 00:29:36.525 "uuid": "6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:36.525 "assigned_rate_limits": { 00:29:36.525 "rw_ios_per_sec": 0, 00:29:36.525 "rw_mbytes_per_sec": 0, 00:29:36.525 "r_mbytes_per_sec": 0, 00:29:36.525 "w_mbytes_per_sec": 0 00:29:36.525 }, 00:29:36.525 "claimed": true, 00:29:36.525 "claim_type": "exclusive_write", 00:29:36.525 "zoned": false, 00:29:36.525 "supported_io_types": { 00:29:36.525 "read": true, 00:29:36.525 "write": true, 00:29:36.525 "unmap": true, 00:29:36.525 "write_zeroes": true, 00:29:36.525 "flush": true, 00:29:36.525 "reset": true, 00:29:36.525 "compare": false, 00:29:36.525 "compare_and_write": false, 00:29:36.525 "abort": true, 00:29:36.525 "nvme_admin": false, 00:29:36.525 "nvme_io": false 00:29:36.525 }, 00:29:36.525 "memory_domains": [ 00:29:36.525 { 00:29:36.525 "dma_device_id": "system", 00:29:36.525 "dma_device_type": 1 00:29:36.525 }, 00:29:36.525 { 00:29:36.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:36.525 "dma_device_type": 2 00:29:36.525 } 00:29:36.525 ], 00:29:36.525 "driver_specific": {} 00:29:36.525 } 00:29:36.525 ] 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:36.525 11:40:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:36.525 "name": "Existed_Raid", 00:29:36.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:36.525 "strip_size_kb": 64, 00:29:36.525 "state": "configuring", 00:29:36.525 "raid_level": "raid0", 00:29:36.525 "superblock": false, 00:29:36.525 "num_base_bdevs": 3, 00:29:36.525 "num_base_bdevs_discovered": 1, 00:29:36.525 "num_base_bdevs_operational": 3, 00:29:36.525 "base_bdevs_list": [ 00:29:36.525 { 00:29:36.525 "name": "BaseBdev1", 00:29:36.525 "uuid": "6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:36.525 "is_configured": true, 00:29:36.525 "data_offset": 0, 00:29:36.525 "data_size": 65536 00:29:36.525 }, 00:29:36.525 { 00:29:36.525 "name": "BaseBdev2", 00:29:36.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:36.525 "is_configured": false, 00:29:36.525 "data_offset": 0, 00:29:36.525 "data_size": 0 00:29:36.525 }, 00:29:36.525 { 00:29:36.525 "name": "BaseBdev3", 00:29:36.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:36.525 "is_configured": false, 00:29:36.525 "data_offset": 0, 
00:29:36.525 "data_size": 0 00:29:36.525 } 00:29:36.525 ] 00:29:36.525 }' 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:36.525 11:40:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:37.093 11:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:37.352 [2024-06-10 11:40:21.081157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:37.352 [2024-06-10 11:40:21.081201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c2e00 name Existed_Raid, state configuring 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:37.352 [2024-06-10 11:40:21.249609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:37.352 [2024-06-10 11:40:21.250651] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:37.352 [2024-06-10 11:40:21.250678] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:37.352 [2024-06-10 11:40:21.250685] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:37.352 [2024-06-10 11:40:21.250692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.352 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:37.612 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:37.612 "name": "Existed_Raid", 00:29:37.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:37.612 "strip_size_kb": 64, 00:29:37.612 "state": "configuring", 00:29:37.612 "raid_level": "raid0", 00:29:37.612 "superblock": false, 00:29:37.612 "num_base_bdevs": 3, 00:29:37.612 "num_base_bdevs_discovered": 1, 00:29:37.612 "num_base_bdevs_operational": 3, 00:29:37.612 "base_bdevs_list": [ 00:29:37.612 { 00:29:37.612 "name": "BaseBdev1", 00:29:37.612 "uuid": "6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:37.612 
"is_configured": true, 00:29:37.612 "data_offset": 0, 00:29:37.612 "data_size": 65536 00:29:37.612 }, 00:29:37.612 { 00:29:37.612 "name": "BaseBdev2", 00:29:37.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:37.612 "is_configured": false, 00:29:37.612 "data_offset": 0, 00:29:37.612 "data_size": 0 00:29:37.612 }, 00:29:37.612 { 00:29:37.612 "name": "BaseBdev3", 00:29:37.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:37.612 "is_configured": false, 00:29:37.612 "data_offset": 0, 00:29:37.612 "data_size": 0 00:29:37.612 } 00:29:37.612 ] 00:29:37.612 }' 00:29:37.612 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:37.612 11:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:38.180 11:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:38.180 [2024-06-10 11:40:22.086855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:38.180 BaseBdev2 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:38.180 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:29:38.438 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:38.697 [ 00:29:38.697 { 00:29:38.697 "name": "BaseBdev2", 00:29:38.697 "aliases": [ 00:29:38.697 "5e336f04-ea08-410d-8486-6c2edc80393e" 00:29:38.697 ], 00:29:38.697 "product_name": "Malloc disk", 00:29:38.697 "block_size": 512, 00:29:38.697 "num_blocks": 65536, 00:29:38.697 "uuid": "5e336f04-ea08-410d-8486-6c2edc80393e", 00:29:38.697 "assigned_rate_limits": { 00:29:38.697 "rw_ios_per_sec": 0, 00:29:38.697 "rw_mbytes_per_sec": 0, 00:29:38.697 "r_mbytes_per_sec": 0, 00:29:38.697 "w_mbytes_per_sec": 0 00:29:38.697 }, 00:29:38.697 "claimed": true, 00:29:38.697 "claim_type": "exclusive_write", 00:29:38.697 "zoned": false, 00:29:38.697 "supported_io_types": { 00:29:38.697 "read": true, 00:29:38.697 "write": true, 00:29:38.697 "unmap": true, 00:29:38.697 "write_zeroes": true, 00:29:38.697 "flush": true, 00:29:38.697 "reset": true, 00:29:38.697 "compare": false, 00:29:38.697 "compare_and_write": false, 00:29:38.697 "abort": true, 00:29:38.697 "nvme_admin": false, 00:29:38.697 "nvme_io": false 00:29:38.697 }, 00:29:38.697 "memory_domains": [ 00:29:38.697 { 00:29:38.697 "dma_device_id": "system", 00:29:38.697 "dma_device_type": 1 00:29:38.697 }, 00:29:38.697 { 00:29:38.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:38.697 "dma_device_type": 2 00:29:38.697 } 00:29:38.697 ], 00:29:38.697 "driver_specific": {} 00:29:38.697 } 00:29:38.697 ] 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 
-- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:38.697 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:38.697 "name": "Existed_Raid", 00:29:38.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:38.697 "strip_size_kb": 64, 00:29:38.697 "state": "configuring", 00:29:38.697 "raid_level": "raid0", 00:29:38.697 "superblock": false, 00:29:38.697 "num_base_bdevs": 3, 00:29:38.697 "num_base_bdevs_discovered": 2, 00:29:38.697 "num_base_bdevs_operational": 3, 00:29:38.697 "base_bdevs_list": [ 00:29:38.697 { 00:29:38.697 "name": "BaseBdev1", 00:29:38.697 "uuid": 
"6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:38.697 "is_configured": true, 00:29:38.697 "data_offset": 0, 00:29:38.697 "data_size": 65536 00:29:38.697 }, 00:29:38.697 { 00:29:38.697 "name": "BaseBdev2", 00:29:38.697 "uuid": "5e336f04-ea08-410d-8486-6c2edc80393e", 00:29:38.697 "is_configured": true, 00:29:38.697 "data_offset": 0, 00:29:38.697 "data_size": 65536 00:29:38.697 }, 00:29:38.697 { 00:29:38.697 "name": "BaseBdev3", 00:29:38.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:38.697 "is_configured": false, 00:29:38.697 "data_offset": 0, 00:29:38.697 "data_size": 0 00:29:38.697 } 00:29:38.697 ] 00:29:38.697 }' 00:29:38.698 11:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:38.698 11:40:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:39.265 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:29:39.524 [2024-06-10 11:40:23.284958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:39.524 [2024-06-10 11:40:23.284994] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12c3cf0 00:29:39.524 [2024-06-10 11:40:23.285003] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:29:39.524 [2024-06-10 11:40:23.285156] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12dabe0 00:29:39.524 [2024-06-10 11:40:23.285248] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12c3cf0 00:29:39.524 [2024-06-10 11:40:23.285255] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12c3cf0 00:29:39.524 [2024-06-10 11:40:23.285378] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:39.524 BaseBdev3 00:29:39.524 11:40:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:29:39.524 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:29:39.524 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:39.524 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:39.524 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:39.524 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:39.524 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:39.783 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:29:39.783 [ 00:29:39.783 { 00:29:39.783 "name": "BaseBdev3", 00:29:39.783 "aliases": [ 00:29:39.783 "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76" 00:29:39.783 ], 00:29:39.783 "product_name": "Malloc disk", 00:29:39.783 "block_size": 512, 00:29:39.783 "num_blocks": 65536, 00:29:39.783 "uuid": "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76", 00:29:39.783 "assigned_rate_limits": { 00:29:39.783 "rw_ios_per_sec": 0, 00:29:39.783 "rw_mbytes_per_sec": 0, 00:29:39.783 "r_mbytes_per_sec": 0, 00:29:39.783 "w_mbytes_per_sec": 0 00:29:39.783 }, 00:29:39.783 "claimed": true, 00:29:39.784 "claim_type": "exclusive_write", 00:29:39.784 "zoned": false, 00:29:39.784 "supported_io_types": { 00:29:39.784 "read": true, 00:29:39.784 "write": true, 00:29:39.784 "unmap": true, 00:29:39.784 "write_zeroes": true, 00:29:39.784 "flush": true, 00:29:39.784 "reset": true, 00:29:39.784 "compare": false, 00:29:39.784 "compare_and_write": false, 
00:29:39.784 "abort": true, 00:29:39.784 "nvme_admin": false, 00:29:39.784 "nvme_io": false 00:29:39.784 }, 00:29:39.784 "memory_domains": [ 00:29:39.784 { 00:29:39.784 "dma_device_id": "system", 00:29:39.784 "dma_device_type": 1 00:29:39.784 }, 00:29:39.784 { 00:29:39.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:39.784 "dma_device_type": 2 00:29:39.784 } 00:29:39.784 ], 00:29:39.784 "driver_specific": {} 00:29:39.784 } 00:29:39.784 ] 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.784 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:40.044 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:40.044 "name": "Existed_Raid", 00:29:40.044 "uuid": "8d0c1475-d391-421e-a422-86ea13fd0cb5", 00:29:40.044 "strip_size_kb": 64, 00:29:40.044 "state": "online", 00:29:40.044 "raid_level": "raid0", 00:29:40.044 "superblock": false, 00:29:40.044 "num_base_bdevs": 3, 00:29:40.044 "num_base_bdevs_discovered": 3, 00:29:40.044 "num_base_bdevs_operational": 3, 00:29:40.044 "base_bdevs_list": [ 00:29:40.044 { 00:29:40.044 "name": "BaseBdev1", 00:29:40.044 "uuid": "6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:40.044 "is_configured": true, 00:29:40.044 "data_offset": 0, 00:29:40.044 "data_size": 65536 00:29:40.044 }, 00:29:40.044 { 00:29:40.044 "name": "BaseBdev2", 00:29:40.044 "uuid": "5e336f04-ea08-410d-8486-6c2edc80393e", 00:29:40.044 "is_configured": true, 00:29:40.044 "data_offset": 0, 00:29:40.044 "data_size": 65536 00:29:40.044 }, 00:29:40.044 { 00:29:40.044 "name": "BaseBdev3", 00:29:40.044 "uuid": "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76", 00:29:40.044 "is_configured": true, 00:29:40.044 "data_offset": 0, 00:29:40.044 "data_size": 65536 00:29:40.044 } 00:29:40.044 ] 00:29:40.044 }' 00:29:40.044 11:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:40.044 11:40:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:40.612 [2024-06-10 11:40:24.476202] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:40.612 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:40.612 "name": "Existed_Raid", 00:29:40.612 "aliases": [ 00:29:40.612 "8d0c1475-d391-421e-a422-86ea13fd0cb5" 00:29:40.612 ], 00:29:40.612 "product_name": "Raid Volume", 00:29:40.612 "block_size": 512, 00:29:40.612 "num_blocks": 196608, 00:29:40.612 "uuid": "8d0c1475-d391-421e-a422-86ea13fd0cb5", 00:29:40.612 "assigned_rate_limits": { 00:29:40.612 "rw_ios_per_sec": 0, 00:29:40.612 "rw_mbytes_per_sec": 0, 00:29:40.612 "r_mbytes_per_sec": 0, 00:29:40.612 "w_mbytes_per_sec": 0 00:29:40.612 }, 00:29:40.612 "claimed": false, 00:29:40.612 "zoned": false, 00:29:40.612 "supported_io_types": { 00:29:40.612 "read": true, 00:29:40.612 "write": true, 00:29:40.612 "unmap": true, 00:29:40.612 "write_zeroes": true, 00:29:40.612 "flush": true, 00:29:40.612 "reset": true, 00:29:40.612 "compare": false, 00:29:40.612 "compare_and_write": false, 00:29:40.612 "abort": false, 00:29:40.612 "nvme_admin": false, 00:29:40.612 "nvme_io": false 00:29:40.612 }, 00:29:40.612 "memory_domains": [ 00:29:40.612 { 00:29:40.612 "dma_device_id": "system", 00:29:40.612 "dma_device_type": 1 00:29:40.612 }, 00:29:40.612 { 00:29:40.612 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:29:40.613 "dma_device_type": 2 00:29:40.613 }, 00:29:40.613 { 00:29:40.613 "dma_device_id": "system", 00:29:40.613 "dma_device_type": 1 00:29:40.613 }, 00:29:40.613 { 00:29:40.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.613 "dma_device_type": 2 00:29:40.613 }, 00:29:40.613 { 00:29:40.613 "dma_device_id": "system", 00:29:40.613 "dma_device_type": 1 00:29:40.613 }, 00:29:40.613 { 00:29:40.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.613 "dma_device_type": 2 00:29:40.613 } 00:29:40.613 ], 00:29:40.613 "driver_specific": { 00:29:40.613 "raid": { 00:29:40.613 "uuid": "8d0c1475-d391-421e-a422-86ea13fd0cb5", 00:29:40.613 "strip_size_kb": 64, 00:29:40.613 "state": "online", 00:29:40.613 "raid_level": "raid0", 00:29:40.613 "superblock": false, 00:29:40.613 "num_base_bdevs": 3, 00:29:40.613 "num_base_bdevs_discovered": 3, 00:29:40.613 "num_base_bdevs_operational": 3, 00:29:40.613 "base_bdevs_list": [ 00:29:40.613 { 00:29:40.613 "name": "BaseBdev1", 00:29:40.613 "uuid": "6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:40.613 "is_configured": true, 00:29:40.613 "data_offset": 0, 00:29:40.613 "data_size": 65536 00:29:40.613 }, 00:29:40.613 { 00:29:40.613 "name": "BaseBdev2", 00:29:40.613 "uuid": "5e336f04-ea08-410d-8486-6c2edc80393e", 00:29:40.613 "is_configured": true, 00:29:40.613 "data_offset": 0, 00:29:40.613 "data_size": 65536 00:29:40.613 }, 00:29:40.613 { 00:29:40.613 "name": "BaseBdev3", 00:29:40.613 "uuid": "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76", 00:29:40.613 "is_configured": true, 00:29:40.613 "data_offset": 0, 00:29:40.613 "data_size": 65536 00:29:40.613 } 00:29:40.613 ] 00:29:40.613 } 00:29:40.613 } 00:29:40.613 }' 00:29:40.613 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:40.613 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:40.613 BaseBdev2 
00:29:40.613 BaseBdev3' 00:29:40.613 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:40.613 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:40.613 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:40.872 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:40.872 "name": "BaseBdev1", 00:29:40.872 "aliases": [ 00:29:40.872 "6834e13f-07cb-467a-8c3d-9e496eb694a6" 00:29:40.872 ], 00:29:40.872 "product_name": "Malloc disk", 00:29:40.872 "block_size": 512, 00:29:40.872 "num_blocks": 65536, 00:29:40.872 "uuid": "6834e13f-07cb-467a-8c3d-9e496eb694a6", 00:29:40.872 "assigned_rate_limits": { 00:29:40.872 "rw_ios_per_sec": 0, 00:29:40.872 "rw_mbytes_per_sec": 0, 00:29:40.872 "r_mbytes_per_sec": 0, 00:29:40.872 "w_mbytes_per_sec": 0 00:29:40.872 }, 00:29:40.872 "claimed": true, 00:29:40.872 "claim_type": "exclusive_write", 00:29:40.872 "zoned": false, 00:29:40.872 "supported_io_types": { 00:29:40.872 "read": true, 00:29:40.872 "write": true, 00:29:40.872 "unmap": true, 00:29:40.872 "write_zeroes": true, 00:29:40.872 "flush": true, 00:29:40.872 "reset": true, 00:29:40.872 "compare": false, 00:29:40.872 "compare_and_write": false, 00:29:40.872 "abort": true, 00:29:40.872 "nvme_admin": false, 00:29:40.872 "nvme_io": false 00:29:40.872 }, 00:29:40.872 "memory_domains": [ 00:29:40.872 { 00:29:40.872 "dma_device_id": "system", 00:29:40.872 "dma_device_type": 1 00:29:40.872 }, 00:29:40.872 { 00:29:40.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.872 "dma_device_type": 2 00:29:40.872 } 00:29:40.872 ], 00:29:40.872 "driver_specific": {} 00:29:40.872 }' 00:29:40.872 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:40.872 11:40:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:40.872 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:40.872 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.130 11:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.130 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:41.130 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:41.130 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:41.130 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:41.390 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:41.390 "name": "BaseBdev2", 00:29:41.390 "aliases": [ 00:29:41.390 "5e336f04-ea08-410d-8486-6c2edc80393e" 00:29:41.390 ], 00:29:41.390 "product_name": "Malloc disk", 00:29:41.390 "block_size": 512, 00:29:41.390 "num_blocks": 65536, 00:29:41.390 "uuid": "5e336f04-ea08-410d-8486-6c2edc80393e", 00:29:41.390 "assigned_rate_limits": { 00:29:41.390 
"rw_ios_per_sec": 0, 00:29:41.390 "rw_mbytes_per_sec": 0, 00:29:41.390 "r_mbytes_per_sec": 0, 00:29:41.390 "w_mbytes_per_sec": 0 00:29:41.390 }, 00:29:41.390 "claimed": true, 00:29:41.390 "claim_type": "exclusive_write", 00:29:41.390 "zoned": false, 00:29:41.390 "supported_io_types": { 00:29:41.390 "read": true, 00:29:41.390 "write": true, 00:29:41.390 "unmap": true, 00:29:41.390 "write_zeroes": true, 00:29:41.390 "flush": true, 00:29:41.390 "reset": true, 00:29:41.390 "compare": false, 00:29:41.390 "compare_and_write": false, 00:29:41.390 "abort": true, 00:29:41.390 "nvme_admin": false, 00:29:41.390 "nvme_io": false 00:29:41.390 }, 00:29:41.390 "memory_domains": [ 00:29:41.390 { 00:29:41.390 "dma_device_id": "system", 00:29:41.390 "dma_device_type": 1 00:29:41.390 }, 00:29:41.390 { 00:29:41.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.390 "dma_device_type": 2 00:29:41.390 } 00:29:41.390 ], 00:29:41.390 "driver_specific": {} 00:29:41.390 }' 00:29:41.390 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.390 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.390 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:41.390 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.390 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:29:41.649 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:41.908 "name": "BaseBdev3", 00:29:41.908 "aliases": [ 00:29:41.908 "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76" 00:29:41.908 ], 00:29:41.908 "product_name": "Malloc disk", 00:29:41.908 "block_size": 512, 00:29:41.908 "num_blocks": 65536, 00:29:41.908 "uuid": "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76", 00:29:41.908 "assigned_rate_limits": { 00:29:41.908 "rw_ios_per_sec": 0, 00:29:41.908 "rw_mbytes_per_sec": 0, 00:29:41.908 "r_mbytes_per_sec": 0, 00:29:41.908 "w_mbytes_per_sec": 0 00:29:41.908 }, 00:29:41.908 "claimed": true, 00:29:41.908 "claim_type": "exclusive_write", 00:29:41.908 "zoned": false, 00:29:41.908 "supported_io_types": { 00:29:41.908 "read": true, 00:29:41.908 "write": true, 00:29:41.908 "unmap": true, 00:29:41.908 "write_zeroes": true, 00:29:41.908 "flush": true, 00:29:41.908 "reset": true, 00:29:41.908 "compare": false, 00:29:41.908 "compare_and_write": false, 00:29:41.908 "abort": true, 00:29:41.908 "nvme_admin": false, 00:29:41.908 "nvme_io": false 00:29:41.908 }, 00:29:41.908 "memory_domains": [ 00:29:41.908 { 00:29:41.908 "dma_device_id": "system", 00:29:41.908 "dma_device_type": 1 00:29:41.908 }, 00:29:41.908 { 00:29:41.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.908 "dma_device_type": 2 00:29:41.908 } 00:29:41.908 ], 
00:29:41.908 "driver_specific": {} 00:29:41.908 }' 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.908 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:42.167 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:42.167 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:42.167 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:42.167 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:42.167 11:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:42.167 [2024-06-10 11:40:26.108460] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:42.167 [2024-06-10 11:40:26.108482] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:42.167 [2024-06-10 11:40:26.108511] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:42.427 11:40:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.427 "name": 
"Existed_Raid", 00:29:42.427 "uuid": "8d0c1475-d391-421e-a422-86ea13fd0cb5", 00:29:42.427 "strip_size_kb": 64, 00:29:42.427 "state": "offline", 00:29:42.427 "raid_level": "raid0", 00:29:42.427 "superblock": false, 00:29:42.427 "num_base_bdevs": 3, 00:29:42.427 "num_base_bdevs_discovered": 2, 00:29:42.427 "num_base_bdevs_operational": 2, 00:29:42.427 "base_bdevs_list": [ 00:29:42.427 { 00:29:42.427 "name": null, 00:29:42.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.427 "is_configured": false, 00:29:42.427 "data_offset": 0, 00:29:42.427 "data_size": 65536 00:29:42.427 }, 00:29:42.427 { 00:29:42.427 "name": "BaseBdev2", 00:29:42.427 "uuid": "5e336f04-ea08-410d-8486-6c2edc80393e", 00:29:42.427 "is_configured": true, 00:29:42.427 "data_offset": 0, 00:29:42.427 "data_size": 65536 00:29:42.427 }, 00:29:42.427 { 00:29:42.427 "name": "BaseBdev3", 00:29:42.427 "uuid": "3bf90a6a-b2aa-420a-bc01-6ab98ed9ef76", 00:29:42.427 "is_configured": true, 00:29:42.427 "data_offset": 0, 00:29:42.427 "data_size": 65536 00:29:42.427 } 00:29:42.427 ] 00:29:42.427 }' 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.427 11:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:29:42.996 11:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:43.255 [2024-06-10 11:40:27.071881] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:43.255 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:43.255 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:43.256 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.256 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:43.515 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:43.515 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:43.515 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:29:43.515 [2024-06-10 11:40:27.430997] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:29:43.515 [2024-06-10 11:40:27.431037] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c3cf0 name Existed_Raid, state offline 00:29:43.515 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:43.515 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:43.515 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.515 11:40:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:43.774 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:43.774 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:43.774 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:29:43.774 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:29:43.774 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:43.774 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:44.033 BaseBdev2 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:44.033 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:44.293 11:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:44.293 [ 00:29:44.293 { 00:29:44.293 "name": 
"BaseBdev2", 00:29:44.293 "aliases": [ 00:29:44.293 "1e4422d4-fbb3-455c-ad98-ebf47dbb2668" 00:29:44.293 ], 00:29:44.293 "product_name": "Malloc disk", 00:29:44.293 "block_size": 512, 00:29:44.293 "num_blocks": 65536, 00:29:44.293 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:44.293 "assigned_rate_limits": { 00:29:44.293 "rw_ios_per_sec": 0, 00:29:44.293 "rw_mbytes_per_sec": 0, 00:29:44.293 "r_mbytes_per_sec": 0, 00:29:44.293 "w_mbytes_per_sec": 0 00:29:44.293 }, 00:29:44.293 "claimed": false, 00:29:44.293 "zoned": false, 00:29:44.293 "supported_io_types": { 00:29:44.293 "read": true, 00:29:44.293 "write": true, 00:29:44.293 "unmap": true, 00:29:44.293 "write_zeroes": true, 00:29:44.293 "flush": true, 00:29:44.293 "reset": true, 00:29:44.293 "compare": false, 00:29:44.293 "compare_and_write": false, 00:29:44.293 "abort": true, 00:29:44.293 "nvme_admin": false, 00:29:44.293 "nvme_io": false 00:29:44.293 }, 00:29:44.293 "memory_domains": [ 00:29:44.293 { 00:29:44.293 "dma_device_id": "system", 00:29:44.293 "dma_device_type": 1 00:29:44.293 }, 00:29:44.293 { 00:29:44.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.293 "dma_device_type": 2 00:29:44.293 } 00:29:44.293 ], 00:29:44.293 "driver_specific": {} 00:29:44.293 } 00:29:44.293 ] 00:29:44.293 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:44.293 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:44.293 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:44.293 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:29:44.553 BaseBdev3 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:44.553 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:29:44.812 [ 00:29:44.812 { 00:29:44.812 "name": "BaseBdev3", 00:29:44.812 "aliases": [ 00:29:44.812 "3c231c06-6895-4686-89c1-6758ff5c6adf" 00:29:44.812 ], 00:29:44.812 "product_name": "Malloc disk", 00:29:44.812 "block_size": 512, 00:29:44.812 "num_blocks": 65536, 00:29:44.812 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:44.812 "assigned_rate_limits": { 00:29:44.812 "rw_ios_per_sec": 0, 00:29:44.812 "rw_mbytes_per_sec": 0, 00:29:44.812 "r_mbytes_per_sec": 0, 00:29:44.812 "w_mbytes_per_sec": 0 00:29:44.812 }, 00:29:44.812 "claimed": false, 00:29:44.812 "zoned": false, 00:29:44.812 "supported_io_types": { 00:29:44.812 "read": true, 00:29:44.812 "write": true, 00:29:44.812 "unmap": true, 00:29:44.812 "write_zeroes": true, 00:29:44.812 "flush": true, 00:29:44.812 "reset": true, 00:29:44.812 "compare": false, 00:29:44.812 "compare_and_write": false, 00:29:44.812 "abort": true, 00:29:44.812 "nvme_admin": false, 00:29:44.812 "nvme_io": false 00:29:44.812 }, 00:29:44.812 "memory_domains": [ 00:29:44.812 { 00:29:44.812 "dma_device_id": "system", 
00:29:44.812 "dma_device_type": 1 00:29:44.812 }, 00:29:44.812 { 00:29:44.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.812 "dma_device_type": 2 00:29:44.812 } 00:29:44.812 ], 00:29:44.812 "driver_specific": {} 00:29:44.812 } 00:29:44.812 ] 00:29:44.812 11:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:44.812 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:44.812 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:44.812 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:45.072 [2024-06-10 11:40:28.846494] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:45.072 [2024-06-10 11:40:28.846534] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:45.072 [2024-06-10 11:40:28.846546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:45.072 [2024-06-10 11:40:28.847579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:45.072 11:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.332 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.332 "name": "Existed_Raid", 00:29:45.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.332 "strip_size_kb": 64, 00:29:45.332 "state": "configuring", 00:29:45.332 "raid_level": "raid0", 00:29:45.332 "superblock": false, 00:29:45.332 "num_base_bdevs": 3, 00:29:45.332 "num_base_bdevs_discovered": 2, 00:29:45.332 "num_base_bdevs_operational": 3, 00:29:45.332 "base_bdevs_list": [ 00:29:45.332 { 00:29:45.332 "name": "BaseBdev1", 00:29:45.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.332 "is_configured": false, 00:29:45.332 "data_offset": 0, 00:29:45.332 "data_size": 0 00:29:45.332 }, 00:29:45.332 { 00:29:45.332 "name": "BaseBdev2", 00:29:45.332 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:45.332 "is_configured": true, 00:29:45.332 "data_offset": 0, 00:29:45.332 "data_size": 65536 00:29:45.332 }, 00:29:45.332 { 00:29:45.332 "name": "BaseBdev3", 00:29:45.332 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:45.332 "is_configured": true, 00:29:45.332 "data_offset": 0, 00:29:45.332 "data_size": 65536 00:29:45.332 
} 00:29:45.332 ] 00:29:45.332 }' 00:29:45.332 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.332 11:40:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:45.591 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:29:45.850 [2024-06-10 11:40:29.652565] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:45.850 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.109 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:46.109 "name": "Existed_Raid", 00:29:46.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.109 "strip_size_kb": 64, 00:29:46.109 "state": "configuring", 00:29:46.109 "raid_level": "raid0", 00:29:46.109 "superblock": false, 00:29:46.109 "num_base_bdevs": 3, 00:29:46.109 "num_base_bdevs_discovered": 1, 00:29:46.109 "num_base_bdevs_operational": 3, 00:29:46.109 "base_bdevs_list": [ 00:29:46.109 { 00:29:46.109 "name": "BaseBdev1", 00:29:46.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.109 "is_configured": false, 00:29:46.109 "data_offset": 0, 00:29:46.109 "data_size": 0 00:29:46.109 }, 00:29:46.109 { 00:29:46.109 "name": null, 00:29:46.109 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:46.109 "is_configured": false, 00:29:46.109 "data_offset": 0, 00:29:46.109 "data_size": 65536 00:29:46.109 }, 00:29:46.109 { 00:29:46.109 "name": "BaseBdev3", 00:29:46.109 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:46.109 "is_configured": true, 00:29:46.109 "data_offset": 0, 00:29:46.109 "data_size": 65536 00:29:46.109 } 00:29:46.109 ] 00:29:46.109 }' 00:29:46.109 11:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:46.109 11:40:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:46.677 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.677 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:29:46.677 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:29:46.677 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:46.937 [2024-06-10 11:40:30.639110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:46.937 BaseBdev1 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:46.937 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:47.196 [ 00:29:47.196 { 00:29:47.196 "name": "BaseBdev1", 00:29:47.196 "aliases": [ 00:29:47.196 "e29db0c5-486e-4bf2-8c78-5f0242c82356" 00:29:47.196 ], 00:29:47.196 "product_name": "Malloc disk", 00:29:47.196 "block_size": 512, 00:29:47.196 "num_blocks": 65536, 00:29:47.196 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:47.196 "assigned_rate_limits": { 00:29:47.196 "rw_ios_per_sec": 0, 00:29:47.196 "rw_mbytes_per_sec": 0, 00:29:47.196 "r_mbytes_per_sec": 0, 00:29:47.196 "w_mbytes_per_sec": 0 00:29:47.196 }, 00:29:47.196 "claimed": true, 00:29:47.196 "claim_type": "exclusive_write", 00:29:47.196 "zoned": 
false, 00:29:47.196 "supported_io_types": { 00:29:47.196 "read": true, 00:29:47.196 "write": true, 00:29:47.196 "unmap": true, 00:29:47.196 "write_zeroes": true, 00:29:47.196 "flush": true, 00:29:47.196 "reset": true, 00:29:47.196 "compare": false, 00:29:47.196 "compare_and_write": false, 00:29:47.196 "abort": true, 00:29:47.196 "nvme_admin": false, 00:29:47.196 "nvme_io": false 00:29:47.196 }, 00:29:47.196 "memory_domains": [ 00:29:47.196 { 00:29:47.196 "dma_device_id": "system", 00:29:47.196 "dma_device_type": 1 00:29:47.196 }, 00:29:47.196 { 00:29:47.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:47.196 "dma_device_type": 2 00:29:47.196 } 00:29:47.196 ], 00:29:47.196 "driver_specific": {} 00:29:47.196 } 00:29:47.196 ] 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.196 11:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:47.196 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:47.196 "name": "Existed_Raid", 00:29:47.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:47.196 "strip_size_kb": 64, 00:29:47.196 "state": "configuring", 00:29:47.196 "raid_level": "raid0", 00:29:47.196 "superblock": false, 00:29:47.196 "num_base_bdevs": 3, 00:29:47.196 "num_base_bdevs_discovered": 2, 00:29:47.196 "num_base_bdevs_operational": 3, 00:29:47.196 "base_bdevs_list": [ 00:29:47.196 { 00:29:47.196 "name": "BaseBdev1", 00:29:47.196 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:47.196 "is_configured": true, 00:29:47.196 "data_offset": 0, 00:29:47.196 "data_size": 65536 00:29:47.196 }, 00:29:47.196 { 00:29:47.197 "name": null, 00:29:47.197 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:47.197 "is_configured": false, 00:29:47.197 "data_offset": 0, 00:29:47.197 "data_size": 65536 00:29:47.197 }, 00:29:47.197 { 00:29:47.197 "name": "BaseBdev3", 00:29:47.197 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:47.197 "is_configured": true, 00:29:47.197 "data_offset": 0, 00:29:47.197 "data_size": 65536 00:29:47.197 } 00:29:47.197 ] 00:29:47.197 }' 00:29:47.197 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:47.197 11:40:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:47.769 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.769 11:40:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:29:48.028 [2024-06-10 11:40:31.922473] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.028 11:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:29:48.288 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:48.288 "name": "Existed_Raid", 00:29:48.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:48.288 "strip_size_kb": 64, 00:29:48.288 "state": "configuring", 00:29:48.288 "raid_level": "raid0", 00:29:48.288 "superblock": false, 00:29:48.288 "num_base_bdevs": 3, 00:29:48.288 "num_base_bdevs_discovered": 1, 00:29:48.288 "num_base_bdevs_operational": 3, 00:29:48.288 "base_bdevs_list": [ 00:29:48.288 { 00:29:48.288 "name": "BaseBdev1", 00:29:48.288 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:48.288 "is_configured": true, 00:29:48.288 "data_offset": 0, 00:29:48.288 "data_size": 65536 00:29:48.288 }, 00:29:48.288 { 00:29:48.288 "name": null, 00:29:48.288 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:48.288 "is_configured": false, 00:29:48.288 "data_offset": 0, 00:29:48.288 "data_size": 65536 00:29:48.288 }, 00:29:48.288 { 00:29:48.288 "name": null, 00:29:48.288 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:48.288 "is_configured": false, 00:29:48.288 "data_offset": 0, 00:29:48.288 "data_size": 65536 00:29:48.288 } 00:29:48.288 ] 00:29:48.288 }' 00:29:48.288 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:48.288 11:40:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:48.855 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.855 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:29:48.855 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:29:48.855 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:29:49.114 [2024-06-10 11:40:32.945146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.114 11:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:49.372 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:49.372 "name": "Existed_Raid", 00:29:49.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:49.373 "strip_size_kb": 64, 00:29:49.373 "state": "configuring", 00:29:49.373 "raid_level": "raid0", 
00:29:49.373 "superblock": false, 00:29:49.373 "num_base_bdevs": 3, 00:29:49.373 "num_base_bdevs_discovered": 2, 00:29:49.373 "num_base_bdevs_operational": 3, 00:29:49.373 "base_bdevs_list": [ 00:29:49.373 { 00:29:49.373 "name": "BaseBdev1", 00:29:49.373 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:49.373 "is_configured": true, 00:29:49.373 "data_offset": 0, 00:29:49.373 "data_size": 65536 00:29:49.373 }, 00:29:49.373 { 00:29:49.373 "name": null, 00:29:49.373 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:49.373 "is_configured": false, 00:29:49.373 "data_offset": 0, 00:29:49.373 "data_size": 65536 00:29:49.373 }, 00:29:49.373 { 00:29:49.373 "name": "BaseBdev3", 00:29:49.373 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:49.373 "is_configured": true, 00:29:49.373 "data_offset": 0, 00:29:49.373 "data_size": 65536 00:29:49.373 } 00:29:49.373 ] 00:29:49.373 }' 00:29:49.373 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:49.373 11:40:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:49.941 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.941 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:29:49.941 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:29:49.941 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:50.200 [2024-06-10 11:40:33.939747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.200 11:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:50.459 11:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:50.459 "name": "Existed_Raid", 00:29:50.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:50.459 "strip_size_kb": 64, 00:29:50.459 "state": "configuring", 00:29:50.459 "raid_level": "raid0", 00:29:50.459 "superblock": false, 00:29:50.459 "num_base_bdevs": 3, 00:29:50.459 "num_base_bdevs_discovered": 1, 00:29:50.459 "num_base_bdevs_operational": 3, 00:29:50.459 "base_bdevs_list": [ 00:29:50.459 { 00:29:50.459 "name": null, 00:29:50.459 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:50.459 "is_configured": false, 
00:29:50.459 "data_offset": 0, 00:29:50.459 "data_size": 65536 00:29:50.459 }, 00:29:50.459 { 00:29:50.459 "name": null, 00:29:50.459 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:50.459 "is_configured": false, 00:29:50.459 "data_offset": 0, 00:29:50.459 "data_size": 65536 00:29:50.459 }, 00:29:50.459 { 00:29:50.459 "name": "BaseBdev3", 00:29:50.459 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:50.459 "is_configured": true, 00:29:50.459 "data_offset": 0, 00:29:50.459 "data_size": 65536 00:29:50.459 } 00:29:50.459 ] 00:29:50.459 }' 00:29:50.459 11:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:50.459 11:40:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:50.719 11:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.719 11:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:29:50.978 11:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:29:50.978 11:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:29:51.237 [2024-06-10 11:40:34.988495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:51.237 "name": "Existed_Raid", 00:29:51.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.237 "strip_size_kb": 64, 00:29:51.237 "state": "configuring", 00:29:51.237 "raid_level": "raid0", 00:29:51.237 "superblock": false, 00:29:51.237 "num_base_bdevs": 3, 00:29:51.237 "num_base_bdevs_discovered": 2, 00:29:51.237 "num_base_bdevs_operational": 3, 00:29:51.237 "base_bdevs_list": [ 00:29:51.237 { 00:29:51.237 "name": null, 00:29:51.237 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:51.237 "is_configured": false, 00:29:51.237 "data_offset": 0, 00:29:51.237 "data_size": 65536 00:29:51.237 }, 00:29:51.237 { 00:29:51.237 "name": "BaseBdev2", 00:29:51.237 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:51.237 "is_configured": true, 00:29:51.237 "data_offset": 0, 00:29:51.237 "data_size": 65536 00:29:51.237 }, 
00:29:51.237 { 00:29:51.237 "name": "BaseBdev3", 00:29:51.237 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:51.237 "is_configured": true, 00:29:51.237 "data_offset": 0, 00:29:51.237 "data_size": 65536 00:29:51.237 } 00:29:51.237 ] 00:29:51.237 }' 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:51.237 11:40:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:51.805 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.805 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:29:52.064 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:29:52.064 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.064 11:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e29db0c5-486e-4bf2-8c78-5f0242c82356 00:29:52.324 [2024-06-10 11:40:36.191666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:29:52.324 [2024-06-10 11:40:36.191705] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12bb410 00:29:52.324 [2024-06-10 11:40:36.191711] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:29:52.324 [2024-06-10 11:40:36.191852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1477140 00:29:52.324 
[2024-06-10 11:40:36.191947] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12bb410 00:29:52.324 [2024-06-10 11:40:36.191954] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12bb410 00:29:52.324 [2024-06-10 11:40:36.192089] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:52.324 NewBaseBdev 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:52.324 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:52.583 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:29:52.583 [ 00:29:52.583 { 00:29:52.583 "name": "NewBaseBdev", 00:29:52.583 "aliases": [ 00:29:52.583 "e29db0c5-486e-4bf2-8c78-5f0242c82356" 00:29:52.583 ], 00:29:52.583 "product_name": "Malloc disk", 00:29:52.583 "block_size": 512, 00:29:52.583 "num_blocks": 65536, 00:29:52.583 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:52.583 "assigned_rate_limits": { 00:29:52.583 "rw_ios_per_sec": 0, 00:29:52.583 "rw_mbytes_per_sec": 0, 00:29:52.583 "r_mbytes_per_sec": 0, 00:29:52.583 
"w_mbytes_per_sec": 0 00:29:52.583 }, 00:29:52.583 "claimed": true, 00:29:52.583 "claim_type": "exclusive_write", 00:29:52.583 "zoned": false, 00:29:52.584 "supported_io_types": { 00:29:52.584 "read": true, 00:29:52.584 "write": true, 00:29:52.584 "unmap": true, 00:29:52.584 "write_zeroes": true, 00:29:52.584 "flush": true, 00:29:52.584 "reset": true, 00:29:52.584 "compare": false, 00:29:52.584 "compare_and_write": false, 00:29:52.584 "abort": true, 00:29:52.584 "nvme_admin": false, 00:29:52.584 "nvme_io": false 00:29:52.584 }, 00:29:52.584 "memory_domains": [ 00:29:52.584 { 00:29:52.584 "dma_device_id": "system", 00:29:52.584 "dma_device_type": 1 00:29:52.584 }, 00:29:52.584 { 00:29:52.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:52.584 "dma_device_type": 2 00:29:52.584 } 00:29:52.584 ], 00:29:52.584 "driver_specific": {} 00:29:52.584 } 00:29:52.584 ] 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:52.584 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.843 "name": "Existed_Raid", 00:29:52.843 "uuid": "ee15541b-d281-4153-8d1a-3b3cb704d8ef", 00:29:52.843 "strip_size_kb": 64, 00:29:52.843 "state": "online", 00:29:52.843 "raid_level": "raid0", 00:29:52.843 "superblock": false, 00:29:52.843 "num_base_bdevs": 3, 00:29:52.843 "num_base_bdevs_discovered": 3, 00:29:52.843 "num_base_bdevs_operational": 3, 00:29:52.843 "base_bdevs_list": [ 00:29:52.843 { 00:29:52.843 "name": "NewBaseBdev", 00:29:52.843 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:52.843 "is_configured": true, 00:29:52.843 "data_offset": 0, 00:29:52.843 "data_size": 65536 00:29:52.843 }, 00:29:52.843 { 00:29:52.843 "name": "BaseBdev2", 00:29:52.843 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:52.843 "is_configured": true, 00:29:52.843 "data_offset": 0, 00:29:52.843 "data_size": 65536 00:29:52.843 }, 00:29:52.843 { 00:29:52.843 "name": "BaseBdev3", 00:29:52.843 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:52.843 "is_configured": true, 00:29:52.843 "data_offset": 0, 00:29:52.843 "data_size": 65536 00:29:52.843 } 00:29:52.843 ] 00:29:52.843 }' 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.843 11:40:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 
00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:53.411 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:53.411 [2024-06-10 11:40:37.346876] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:53.670 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:53.670 "name": "Existed_Raid", 00:29:53.670 "aliases": [ 00:29:53.670 "ee15541b-d281-4153-8d1a-3b3cb704d8ef" 00:29:53.670 ], 00:29:53.670 "product_name": "Raid Volume", 00:29:53.670 "block_size": 512, 00:29:53.670 "num_blocks": 196608, 00:29:53.670 "uuid": "ee15541b-d281-4153-8d1a-3b3cb704d8ef", 00:29:53.670 "assigned_rate_limits": { 00:29:53.670 "rw_ios_per_sec": 0, 00:29:53.670 "rw_mbytes_per_sec": 0, 00:29:53.670 "r_mbytes_per_sec": 0, 00:29:53.670 "w_mbytes_per_sec": 0 00:29:53.670 }, 00:29:53.671 "claimed": false, 00:29:53.671 "zoned": false, 00:29:53.671 "supported_io_types": { 00:29:53.671 "read": true, 00:29:53.671 "write": true, 00:29:53.671 "unmap": true, 00:29:53.671 "write_zeroes": true, 00:29:53.671 "flush": true, 00:29:53.671 "reset": true, 00:29:53.671 "compare": false, 00:29:53.671 "compare_and_write": false, 00:29:53.671 "abort": false, 00:29:53.671 "nvme_admin": false, 00:29:53.671 
"nvme_io": false 00:29:53.671 }, 00:29:53.671 "memory_domains": [ 00:29:53.671 { 00:29:53.671 "dma_device_id": "system", 00:29:53.671 "dma_device_type": 1 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:53.671 "dma_device_type": 2 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "dma_device_id": "system", 00:29:53.671 "dma_device_type": 1 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:53.671 "dma_device_type": 2 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "dma_device_id": "system", 00:29:53.671 "dma_device_type": 1 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:53.671 "dma_device_type": 2 00:29:53.671 } 00:29:53.671 ], 00:29:53.671 "driver_specific": { 00:29:53.671 "raid": { 00:29:53.671 "uuid": "ee15541b-d281-4153-8d1a-3b3cb704d8ef", 00:29:53.671 "strip_size_kb": 64, 00:29:53.671 "state": "online", 00:29:53.671 "raid_level": "raid0", 00:29:53.671 "superblock": false, 00:29:53.671 "num_base_bdevs": 3, 00:29:53.671 "num_base_bdevs_discovered": 3, 00:29:53.671 "num_base_bdevs_operational": 3, 00:29:53.671 "base_bdevs_list": [ 00:29:53.671 { 00:29:53.671 "name": "NewBaseBdev", 00:29:53.671 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:53.671 "is_configured": true, 00:29:53.671 "data_offset": 0, 00:29:53.671 "data_size": 65536 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "name": "BaseBdev2", 00:29:53.671 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:53.671 "is_configured": true, 00:29:53.671 "data_offset": 0, 00:29:53.671 "data_size": 65536 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "name": "BaseBdev3", 00:29:53.671 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:53.671 "is_configured": true, 00:29:53.671 "data_offset": 0, 00:29:53.671 "data_size": 65536 00:29:53.671 } 00:29:53.671 ] 00:29:53.671 } 00:29:53.671 } 00:29:53.671 }' 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:29:53.671 BaseBdev2 00:29:53.671 BaseBdev3' 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:53.671 "name": "NewBaseBdev", 00:29:53.671 "aliases": [ 00:29:53.671 "e29db0c5-486e-4bf2-8c78-5f0242c82356" 00:29:53.671 ], 00:29:53.671 "product_name": "Malloc disk", 00:29:53.671 "block_size": 512, 00:29:53.671 "num_blocks": 65536, 00:29:53.671 "uuid": "e29db0c5-486e-4bf2-8c78-5f0242c82356", 00:29:53.671 "assigned_rate_limits": { 00:29:53.671 "rw_ios_per_sec": 0, 00:29:53.671 "rw_mbytes_per_sec": 0, 00:29:53.671 "r_mbytes_per_sec": 0, 00:29:53.671 "w_mbytes_per_sec": 0 00:29:53.671 }, 00:29:53.671 "claimed": true, 00:29:53.671 "claim_type": "exclusive_write", 00:29:53.671 "zoned": false, 00:29:53.671 "supported_io_types": { 00:29:53.671 "read": true, 00:29:53.671 "write": true, 00:29:53.671 "unmap": true, 00:29:53.671 "write_zeroes": true, 00:29:53.671 "flush": true, 00:29:53.671 "reset": true, 00:29:53.671 "compare": false, 00:29:53.671 "compare_and_write": false, 00:29:53.671 "abort": true, 00:29:53.671 "nvme_admin": false, 00:29:53.671 "nvme_io": false 00:29:53.671 }, 00:29:53.671 "memory_domains": [ 00:29:53.671 { 00:29:53.671 "dma_device_id": "system", 00:29:53.671 "dma_device_type": 1 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:53.671 
"dma_device_type": 2 00:29:53.671 } 00:29:53.671 ], 00:29:53.671 "driver_specific": {} 00:29:53.671 }' 00:29:53.671 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:53.930 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:54.189 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:54.189 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:54.189 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:54.189 11:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:54.189 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:54.189 "name": "BaseBdev2", 00:29:54.189 "aliases": [ 00:29:54.189 "1e4422d4-fbb3-455c-ad98-ebf47dbb2668" 00:29:54.189 ], 00:29:54.189 
"product_name": "Malloc disk", 00:29:54.189 "block_size": 512, 00:29:54.189 "num_blocks": 65536, 00:29:54.189 "uuid": "1e4422d4-fbb3-455c-ad98-ebf47dbb2668", 00:29:54.189 "assigned_rate_limits": { 00:29:54.189 "rw_ios_per_sec": 0, 00:29:54.189 "rw_mbytes_per_sec": 0, 00:29:54.189 "r_mbytes_per_sec": 0, 00:29:54.189 "w_mbytes_per_sec": 0 00:29:54.189 }, 00:29:54.189 "claimed": true, 00:29:54.189 "claim_type": "exclusive_write", 00:29:54.189 "zoned": false, 00:29:54.189 "supported_io_types": { 00:29:54.189 "read": true, 00:29:54.189 "write": true, 00:29:54.189 "unmap": true, 00:29:54.189 "write_zeroes": true, 00:29:54.189 "flush": true, 00:29:54.189 "reset": true, 00:29:54.189 "compare": false, 00:29:54.189 "compare_and_write": false, 00:29:54.189 "abort": true, 00:29:54.189 "nvme_admin": false, 00:29:54.189 "nvme_io": false 00:29:54.189 }, 00:29:54.189 "memory_domains": [ 00:29:54.189 { 00:29:54.189 "dma_device_id": "system", 00:29:54.189 "dma_device_type": 1 00:29:54.189 }, 00:29:54.189 { 00:29:54.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:54.189 "dma_device_type": 2 00:29:54.189 } 00:29:54.189 ], 00:29:54.189 "driver_specific": {} 00:29:54.189 }' 00:29:54.189 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:54.189 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:54.189 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:54.189 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:29:54.449 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:54.707 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:54.707 "name": "BaseBdev3", 00:29:54.707 "aliases": [ 00:29:54.707 "3c231c06-6895-4686-89c1-6758ff5c6adf" 00:29:54.707 ], 00:29:54.707 "product_name": "Malloc disk", 00:29:54.707 "block_size": 512, 00:29:54.707 "num_blocks": 65536, 00:29:54.707 "uuid": "3c231c06-6895-4686-89c1-6758ff5c6adf", 00:29:54.707 "assigned_rate_limits": { 00:29:54.707 "rw_ios_per_sec": 0, 00:29:54.707 "rw_mbytes_per_sec": 0, 00:29:54.707 "r_mbytes_per_sec": 0, 00:29:54.707 "w_mbytes_per_sec": 0 00:29:54.707 }, 00:29:54.707 "claimed": true, 00:29:54.707 "claim_type": "exclusive_write", 00:29:54.707 "zoned": false, 00:29:54.707 "supported_io_types": { 00:29:54.707 "read": true, 00:29:54.707 "write": true, 00:29:54.707 "unmap": true, 00:29:54.707 "write_zeroes": true, 00:29:54.707 "flush": true, 00:29:54.707 "reset": true, 00:29:54.707 "compare": false, 00:29:54.707 "compare_and_write": false, 00:29:54.707 "abort": true, 00:29:54.707 "nvme_admin": false, 00:29:54.707 "nvme_io": false 00:29:54.707 }, 00:29:54.707 "memory_domains": [ 00:29:54.707 { 00:29:54.707 
"dma_device_id": "system", 00:29:54.707 "dma_device_type": 1 00:29:54.707 }, 00:29:54.707 { 00:29:54.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:54.707 "dma_device_type": 2 00:29:54.707 } 00:29:54.707 ], 00:29:54.707 "driver_specific": {} 00:29:54.707 }' 00:29:54.707 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:54.707 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:54.707 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:54.707 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:54.707 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:54.966 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:54.967 11:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:55.228 [2024-06-10 11:40:39.002991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:55.228 [2024-06-10 11:40:39.003017] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:55.228 [2024-06-10 
11:40:39.003057] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:55.228 [2024-06-10 11:40:39.003096] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:55.228 [2024-06-10 11:40:39.003104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12bb410 name Existed_Raid, state offline 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 147235 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 147235 ']' 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 147235 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 147235 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 147235' 00:29:55.228 killing process with pid 147235 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 147235 00:29:55.228 [2024-06-10 11:40:39.060133] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:55.228 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 147235 00:29:55.228 [2024-06-10 11:40:39.089508] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # 
return 0 00:29:55.563 00:29:55.563 real 0m21.519s 00:29:55.563 user 0m39.203s 00:29:55.563 sys 0m4.139s 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:55.563 ************************************ 00:29:55.563 END TEST raid_state_function_test 00:29:55.563 ************************************ 00:29:55.563 11:40:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:29:55.563 11:40:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:55.563 11:40:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:55.563 11:40:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:55.563 ************************************ 00:29:55.563 START TEST raid_state_function_test_sb 00:29:55.563 ************************************ 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 true 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( 
i++ )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:55.563 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=150686 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 150686' 00:29:55.564 Process raid pid: 150686 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 150686 /var/tmp/spdk-raid.sock 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 150686 ']' 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:55.564 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:55.564 11:40:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:55.564 [2024-06-10 11:40:39.440953] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:29:55.564 [2024-06-10 11:40:39.441008] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:55.830 [2024-06-10 11:40:39.527048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.830 [2024-06-10 11:40:39.609492] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.830 [2024-06-10 11:40:39.664970] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:55.830 [2024-06-10 11:40:39.664995] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:56.397 11:40:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:56.397 11:40:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:29:56.397 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:56.657 [2024-06-10 11:40:40.398786] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:56.657 [2024-06-10 11:40:40.398823] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:56.657 [2024-06-10 11:40:40.398831] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:56.657 [2024-06-10 11:40:40.398854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:56.657 [2024-06-10 11:40:40.398860] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:56.657 [2024-06-10 11:40:40.398881] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:56.657 11:40:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:56.657 "name": "Existed_Raid", 00:29:56.657 "uuid": "34b910e7-fc60-4e9e-b513-4123108754f9", 00:29:56.657 "strip_size_kb": 64, 00:29:56.657 "state": "configuring", 00:29:56.657 "raid_level": "raid0", 00:29:56.657 "superblock": true, 00:29:56.657 "num_base_bdevs": 3, 00:29:56.657 "num_base_bdevs_discovered": 0, 00:29:56.657 "num_base_bdevs_operational": 3, 00:29:56.657 
"base_bdevs_list": [ 00:29:56.657 { 00:29:56.657 "name": "BaseBdev1", 00:29:56.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:56.657 "is_configured": false, 00:29:56.657 "data_offset": 0, 00:29:56.657 "data_size": 0 00:29:56.657 }, 00:29:56.657 { 00:29:56.657 "name": "BaseBdev2", 00:29:56.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:56.657 "is_configured": false, 00:29:56.657 "data_offset": 0, 00:29:56.657 "data_size": 0 00:29:56.657 }, 00:29:56.657 { 00:29:56.657 "name": "BaseBdev3", 00:29:56.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:56.657 "is_configured": false, 00:29:56.657 "data_offset": 0, 00:29:56.657 "data_size": 0 00:29:56.657 } 00:29:56.657 ] 00:29:56.657 }' 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:56.657 11:40:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:57.225 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:57.485 [2024-06-10 11:40:41.208781] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:57.485 [2024-06-10 11:40:41.208812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x152b530 name Existed_Raid, state configuring 00:29:57.485 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:57.485 [2024-06-10 11:40:41.373236] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:57.485 [2024-06-10 11:40:41.373262] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:57.485 [2024-06-10 11:40:41.373268] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:57.485 [2024-06-10 11:40:41.373276] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:57.485 [2024-06-10 11:40:41.373298] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:57.485 [2024-06-10 11:40:41.373305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:57.485 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:57.744 [2024-06-10 11:40:41.551581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:57.745 BaseBdev1 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:57.745 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:58.004 [ 00:29:58.004 { 
00:29:58.004 "name": "BaseBdev1", 00:29:58.004 "aliases": [ 00:29:58.004 "1212f1c8-8c81-400d-85c2-e509e8dbea84" 00:29:58.004 ], 00:29:58.004 "product_name": "Malloc disk", 00:29:58.004 "block_size": 512, 00:29:58.004 "num_blocks": 65536, 00:29:58.004 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:29:58.004 "assigned_rate_limits": { 00:29:58.004 "rw_ios_per_sec": 0, 00:29:58.004 "rw_mbytes_per_sec": 0, 00:29:58.004 "r_mbytes_per_sec": 0, 00:29:58.004 "w_mbytes_per_sec": 0 00:29:58.004 }, 00:29:58.004 "claimed": true, 00:29:58.004 "claim_type": "exclusive_write", 00:29:58.004 "zoned": false, 00:29:58.004 "supported_io_types": { 00:29:58.004 "read": true, 00:29:58.004 "write": true, 00:29:58.004 "unmap": true, 00:29:58.004 "write_zeroes": true, 00:29:58.004 "flush": true, 00:29:58.004 "reset": true, 00:29:58.004 "compare": false, 00:29:58.004 "compare_and_write": false, 00:29:58.004 "abort": true, 00:29:58.004 "nvme_admin": false, 00:29:58.004 "nvme_io": false 00:29:58.004 }, 00:29:58.004 "memory_domains": [ 00:29:58.004 { 00:29:58.004 "dma_device_id": "system", 00:29:58.004 "dma_device_type": 1 00:29:58.004 }, 00:29:58.004 { 00:29:58.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:58.004 "dma_device_type": 2 00:29:58.004 } 00:29:58.004 ], 00:29:58.004 "driver_specific": {} 00:29:58.004 } 00:29:58.004 ] 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:58.004 11:40:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:58.004 11:40:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:58.263 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:58.263 "name": "Existed_Raid", 00:29:58.263 "uuid": "35d7eba1-5186-497e-8961-2ae1de13b54f", 00:29:58.263 "strip_size_kb": 64, 00:29:58.263 "state": "configuring", 00:29:58.263 "raid_level": "raid0", 00:29:58.263 "superblock": true, 00:29:58.263 "num_base_bdevs": 3, 00:29:58.263 "num_base_bdevs_discovered": 1, 00:29:58.263 "num_base_bdevs_operational": 3, 00:29:58.263 "base_bdevs_list": [ 00:29:58.263 { 00:29:58.263 "name": "BaseBdev1", 00:29:58.263 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:29:58.263 "is_configured": true, 00:29:58.263 "data_offset": 2048, 00:29:58.263 "data_size": 63488 00:29:58.263 }, 00:29:58.263 { 00:29:58.263 "name": "BaseBdev2", 00:29:58.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:58.263 "is_configured": false, 00:29:58.263 "data_offset": 0, 00:29:58.263 "data_size": 0 00:29:58.263 }, 00:29:58.263 { 00:29:58.263 "name": 
"BaseBdev3", 00:29:58.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:58.263 "is_configured": false, 00:29:58.263 "data_offset": 0, 00:29:58.263 "data_size": 0 00:29:58.263 } 00:29:58.263 ] 00:29:58.263 }' 00:29:58.263 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:58.263 11:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:58.830 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:58.830 [2024-06-10 11:40:42.722614] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:58.830 [2024-06-10 11:40:42.722647] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x152ae00 name Existed_Raid, state configuring 00:29:58.831 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:29:59.089 [2024-06-10 11:40:42.895090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:59.089 [2024-06-10 11:40:42.896159] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:59.089 [2024-06-10 11:40:42.896186] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:59.089 [2024-06-10 11:40:42.896192] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:59.089 [2024-06-10 11:40:42.896200] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:59.089 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:59.089 11:40:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:59.089 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:29:59.089 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:59.090 11:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:59.348 11:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:59.348 "name": "Existed_Raid", 00:29:59.348 "uuid": "f3b53935-7510-43ec-8d2f-75bfd8e2cabf", 00:29:59.348 "strip_size_kb": 64, 00:29:59.348 "state": "configuring", 00:29:59.348 "raid_level": "raid0", 00:29:59.348 "superblock": true, 00:29:59.348 "num_base_bdevs": 3, 00:29:59.348 
"num_base_bdevs_discovered": 1, 00:29:59.348 "num_base_bdevs_operational": 3, 00:29:59.348 "base_bdevs_list": [ 00:29:59.348 { 00:29:59.348 "name": "BaseBdev1", 00:29:59.348 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:29:59.348 "is_configured": true, 00:29:59.348 "data_offset": 2048, 00:29:59.348 "data_size": 63488 00:29:59.348 }, 00:29:59.348 { 00:29:59.348 "name": "BaseBdev2", 00:29:59.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:59.348 "is_configured": false, 00:29:59.348 "data_offset": 0, 00:29:59.348 "data_size": 0 00:29:59.348 }, 00:29:59.348 { 00:29:59.348 "name": "BaseBdev3", 00:29:59.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:59.348 "is_configured": false, 00:29:59.348 "data_offset": 0, 00:29:59.348 "data_size": 0 00:29:59.348 } 00:29:59.348 ] 00:29:59.348 }' 00:29:59.349 11:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:59.349 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:59.917 [2024-06-10 11:40:43.756249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:59.917 BaseBdev2 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:59.917 11:40:43 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:59.917 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:00.176 11:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:00.176 [ 00:30:00.176 { 00:30:00.176 "name": "BaseBdev2", 00:30:00.176 "aliases": [ 00:30:00.176 "ff874d28-3a69-4460-8d04-aaeb81dba6f4" 00:30:00.176 ], 00:30:00.176 "product_name": "Malloc disk", 00:30:00.176 "block_size": 512, 00:30:00.176 "num_blocks": 65536, 00:30:00.176 "uuid": "ff874d28-3a69-4460-8d04-aaeb81dba6f4", 00:30:00.176 "assigned_rate_limits": { 00:30:00.176 "rw_ios_per_sec": 0, 00:30:00.176 "rw_mbytes_per_sec": 0, 00:30:00.176 "r_mbytes_per_sec": 0, 00:30:00.176 "w_mbytes_per_sec": 0 00:30:00.176 }, 00:30:00.176 "claimed": true, 00:30:00.176 "claim_type": "exclusive_write", 00:30:00.176 "zoned": false, 00:30:00.176 "supported_io_types": { 00:30:00.176 "read": true, 00:30:00.176 "write": true, 00:30:00.176 "unmap": true, 00:30:00.176 "write_zeroes": true, 00:30:00.176 "flush": true, 00:30:00.176 "reset": true, 00:30:00.176 "compare": false, 00:30:00.176 "compare_and_write": false, 00:30:00.176 "abort": true, 00:30:00.176 "nvme_admin": false, 00:30:00.176 "nvme_io": false 00:30:00.176 }, 00:30:00.176 "memory_domains": [ 00:30:00.176 { 00:30:00.176 "dma_device_id": "system", 00:30:00.176 "dma_device_type": 1 00:30:00.176 }, 00:30:00.176 { 00:30:00.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:00.176 "dma_device_type": 2 00:30:00.176 } 00:30:00.176 ], 00:30:00.176 "driver_specific": {} 00:30:00.176 } 00:30:00.176 ] 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 
00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:00.436 "name": "Existed_Raid", 00:30:00.436 "uuid": "f3b53935-7510-43ec-8d2f-75bfd8e2cabf", 00:30:00.436 "strip_size_kb": 64, 00:30:00.436 
"state": "configuring", 00:30:00.436 "raid_level": "raid0", 00:30:00.436 "superblock": true, 00:30:00.436 "num_base_bdevs": 3, 00:30:00.436 "num_base_bdevs_discovered": 2, 00:30:00.436 "num_base_bdevs_operational": 3, 00:30:00.436 "base_bdevs_list": [ 00:30:00.436 { 00:30:00.436 "name": "BaseBdev1", 00:30:00.436 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:30:00.436 "is_configured": true, 00:30:00.436 "data_offset": 2048, 00:30:00.436 "data_size": 63488 00:30:00.436 }, 00:30:00.436 { 00:30:00.436 "name": "BaseBdev2", 00:30:00.436 "uuid": "ff874d28-3a69-4460-8d04-aaeb81dba6f4", 00:30:00.436 "is_configured": true, 00:30:00.436 "data_offset": 2048, 00:30:00.436 "data_size": 63488 00:30:00.436 }, 00:30:00.436 { 00:30:00.436 "name": "BaseBdev3", 00:30:00.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:00.436 "is_configured": false, 00:30:00.436 "data_offset": 0, 00:30:00.436 "data_size": 0 00:30:00.436 } 00:30:00.436 ] 00:30:00.436 }' 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:00.436 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:01.005 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:30:01.005 [2024-06-10 11:40:44.938519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:01.005 [2024-06-10 11:40:44.938649] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x152bcf0 00:30:01.005 [2024-06-10 11:40:44.938659] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:30:01.005 [2024-06-10 11:40:44.938778] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1542be0 00:30:01.005 [2024-06-10 11:40:44.938859] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x152bcf0 00:30:01.005 [2024-06-10 11:40:44.938875] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x152bcf0 00:30:01.005 [2024-06-10 11:40:44.938952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:01.005 BaseBdev3 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:01.263 11:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:01.263 11:40:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:30:01.522 [ 00:30:01.522 { 00:30:01.522 "name": "BaseBdev3", 00:30:01.522 "aliases": [ 00:30:01.522 "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd" 00:30:01.522 ], 00:30:01.522 "product_name": "Malloc disk", 00:30:01.522 "block_size": 512, 00:30:01.522 "num_blocks": 65536, 00:30:01.522 "uuid": "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd", 00:30:01.522 "assigned_rate_limits": { 00:30:01.522 "rw_ios_per_sec": 0, 00:30:01.522 "rw_mbytes_per_sec": 0, 00:30:01.522 "r_mbytes_per_sec": 0, 00:30:01.522 "w_mbytes_per_sec": 0 00:30:01.522 }, 00:30:01.522 "claimed": true, 00:30:01.522 
"claim_type": "exclusive_write", 00:30:01.522 "zoned": false, 00:30:01.522 "supported_io_types": { 00:30:01.522 "read": true, 00:30:01.522 "write": true, 00:30:01.522 "unmap": true, 00:30:01.522 "write_zeroes": true, 00:30:01.522 "flush": true, 00:30:01.523 "reset": true, 00:30:01.523 "compare": false, 00:30:01.523 "compare_and_write": false, 00:30:01.523 "abort": true, 00:30:01.523 "nvme_admin": false, 00:30:01.523 "nvme_io": false 00:30:01.523 }, 00:30:01.523 "memory_domains": [ 00:30:01.523 { 00:30:01.523 "dma_device_id": "system", 00:30:01.523 "dma_device_type": 1 00:30:01.523 }, 00:30:01.523 { 00:30:01.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:01.523 "dma_device_type": 2 00:30:01.523 } 00:30:01.523 ], 00:30:01.523 "driver_specific": {} 00:30:01.523 } 00:30:01.523 ] 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:01.523 11:40:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:01.523 "name": "Existed_Raid", 00:30:01.523 "uuid": "f3b53935-7510-43ec-8d2f-75bfd8e2cabf", 00:30:01.523 "strip_size_kb": 64, 00:30:01.523 "state": "online", 00:30:01.523 "raid_level": "raid0", 00:30:01.523 "superblock": true, 00:30:01.523 "num_base_bdevs": 3, 00:30:01.523 "num_base_bdevs_discovered": 3, 00:30:01.523 "num_base_bdevs_operational": 3, 00:30:01.523 "base_bdevs_list": [ 00:30:01.523 { 00:30:01.523 "name": "BaseBdev1", 00:30:01.523 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:30:01.523 "is_configured": true, 00:30:01.523 "data_offset": 2048, 00:30:01.523 "data_size": 63488 00:30:01.523 }, 00:30:01.523 { 00:30:01.523 "name": "BaseBdev2", 00:30:01.523 "uuid": "ff874d28-3a69-4460-8d04-aaeb81dba6f4", 00:30:01.523 "is_configured": true, 00:30:01.523 "data_offset": 2048, 00:30:01.523 "data_size": 63488 00:30:01.523 }, 00:30:01.523 { 00:30:01.523 "name": "BaseBdev3", 00:30:01.523 "uuid": "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd", 00:30:01.523 "is_configured": true, 00:30:01.523 "data_offset": 2048, 00:30:01.523 "data_size": 63488 00:30:01.523 } 00:30:01.523 ] 00:30:01.523 }' 00:30:01.523 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:01.523 11:40:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:02.091 11:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:02.351 [2024-06-10 11:40:46.134090] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:02.351 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:02.351 "name": "Existed_Raid", 00:30:02.351 "aliases": [ 00:30:02.351 "f3b53935-7510-43ec-8d2f-75bfd8e2cabf" 00:30:02.351 ], 00:30:02.351 "product_name": "Raid Volume", 00:30:02.351 "block_size": 512, 00:30:02.351 "num_blocks": 190464, 00:30:02.351 "uuid": "f3b53935-7510-43ec-8d2f-75bfd8e2cabf", 00:30:02.351 "assigned_rate_limits": { 00:30:02.351 "rw_ios_per_sec": 0, 00:30:02.351 "rw_mbytes_per_sec": 0, 00:30:02.351 "r_mbytes_per_sec": 0, 00:30:02.351 "w_mbytes_per_sec": 0 00:30:02.351 }, 00:30:02.351 "claimed": false, 00:30:02.351 "zoned": false, 00:30:02.351 "supported_io_types": { 00:30:02.351 "read": true, 00:30:02.351 "write": true, 00:30:02.351 "unmap": true, 
00:30:02.351 "write_zeroes": true, 00:30:02.351 "flush": true, 00:30:02.351 "reset": true, 00:30:02.351 "compare": false, 00:30:02.351 "compare_and_write": false, 00:30:02.351 "abort": false, 00:30:02.351 "nvme_admin": false, 00:30:02.351 "nvme_io": false 00:30:02.351 }, 00:30:02.351 "memory_domains": [ 00:30:02.351 { 00:30:02.351 "dma_device_id": "system", 00:30:02.351 "dma_device_type": 1 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:02.351 "dma_device_type": 2 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "dma_device_id": "system", 00:30:02.351 "dma_device_type": 1 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:02.351 "dma_device_type": 2 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "dma_device_id": "system", 00:30:02.351 "dma_device_type": 1 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:02.351 "dma_device_type": 2 00:30:02.351 } 00:30:02.351 ], 00:30:02.351 "driver_specific": { 00:30:02.351 "raid": { 00:30:02.351 "uuid": "f3b53935-7510-43ec-8d2f-75bfd8e2cabf", 00:30:02.351 "strip_size_kb": 64, 00:30:02.351 "state": "online", 00:30:02.351 "raid_level": "raid0", 00:30:02.351 "superblock": true, 00:30:02.351 "num_base_bdevs": 3, 00:30:02.351 "num_base_bdevs_discovered": 3, 00:30:02.351 "num_base_bdevs_operational": 3, 00:30:02.351 "base_bdevs_list": [ 00:30:02.351 { 00:30:02.351 "name": "BaseBdev1", 00:30:02.351 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:30:02.351 "is_configured": true, 00:30:02.351 "data_offset": 2048, 00:30:02.351 "data_size": 63488 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "name": "BaseBdev2", 00:30:02.351 "uuid": "ff874d28-3a69-4460-8d04-aaeb81dba6f4", 00:30:02.351 "is_configured": true, 00:30:02.351 "data_offset": 2048, 00:30:02.351 "data_size": 63488 00:30:02.351 }, 00:30:02.351 { 00:30:02.351 "name": "BaseBdev3", 00:30:02.351 "uuid": "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd", 00:30:02.351 
"is_configured": true, 00:30:02.351 "data_offset": 2048, 00:30:02.351 "data_size": 63488 00:30:02.351 } 00:30:02.351 ] 00:30:02.351 } 00:30:02.351 } 00:30:02.351 }' 00:30:02.351 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:02.351 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:30:02.351 BaseBdev2 00:30:02.351 BaseBdev3' 00:30:02.351 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:02.351 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:30:02.351 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:02.610 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:02.610 "name": "BaseBdev1", 00:30:02.610 "aliases": [ 00:30:02.610 "1212f1c8-8c81-400d-85c2-e509e8dbea84" 00:30:02.610 ], 00:30:02.610 "product_name": "Malloc disk", 00:30:02.610 "block_size": 512, 00:30:02.610 "num_blocks": 65536, 00:30:02.610 "uuid": "1212f1c8-8c81-400d-85c2-e509e8dbea84", 00:30:02.610 "assigned_rate_limits": { 00:30:02.610 "rw_ios_per_sec": 0, 00:30:02.610 "rw_mbytes_per_sec": 0, 00:30:02.610 "r_mbytes_per_sec": 0, 00:30:02.610 "w_mbytes_per_sec": 0 00:30:02.610 }, 00:30:02.610 "claimed": true, 00:30:02.610 "claim_type": "exclusive_write", 00:30:02.610 "zoned": false, 00:30:02.611 "supported_io_types": { 00:30:02.611 "read": true, 00:30:02.611 "write": true, 00:30:02.611 "unmap": true, 00:30:02.611 "write_zeroes": true, 00:30:02.611 "flush": true, 00:30:02.611 "reset": true, 00:30:02.611 "compare": false, 00:30:02.611 "compare_and_write": false, 00:30:02.611 "abort": true, 00:30:02.611 "nvme_admin": false, 00:30:02.611 
"nvme_io": false 00:30:02.611 }, 00:30:02.611 "memory_domains": [ 00:30:02.611 { 00:30:02.611 "dma_device_id": "system", 00:30:02.611 "dma_device_type": 1 00:30:02.611 }, 00:30:02.611 { 00:30:02.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:02.611 "dma_device_type": 2 00:30:02.611 } 00:30:02.611 ], 00:30:02.611 "driver_specific": {} 00:30:02.611 }' 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:02.611 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:02.870 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:03.129 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:03.129 "name": "BaseBdev2", 00:30:03.129 "aliases": [ 00:30:03.129 "ff874d28-3a69-4460-8d04-aaeb81dba6f4" 00:30:03.129 ], 00:30:03.129 "product_name": "Malloc disk", 00:30:03.129 "block_size": 512, 00:30:03.129 "num_blocks": 65536, 00:30:03.129 "uuid": "ff874d28-3a69-4460-8d04-aaeb81dba6f4", 00:30:03.129 "assigned_rate_limits": { 00:30:03.129 "rw_ios_per_sec": 0, 00:30:03.129 "rw_mbytes_per_sec": 0, 00:30:03.129 "r_mbytes_per_sec": 0, 00:30:03.129 "w_mbytes_per_sec": 0 00:30:03.129 }, 00:30:03.129 "claimed": true, 00:30:03.129 "claim_type": "exclusive_write", 00:30:03.129 "zoned": false, 00:30:03.129 "supported_io_types": { 00:30:03.129 "read": true, 00:30:03.129 "write": true, 00:30:03.129 "unmap": true, 00:30:03.129 "write_zeroes": true, 00:30:03.129 "flush": true, 00:30:03.129 "reset": true, 00:30:03.129 "compare": false, 00:30:03.129 "compare_and_write": false, 00:30:03.129 "abort": true, 00:30:03.129 "nvme_admin": false, 00:30:03.129 "nvme_io": false 00:30:03.129 }, 00:30:03.129 "memory_domains": [ 00:30:03.129 { 00:30:03.129 "dma_device_id": "system", 00:30:03.129 "dma_device_type": 1 00:30:03.129 }, 00:30:03.129 { 00:30:03.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:03.129 "dma_device_type": 2 00:30:03.129 } 00:30:03.129 ], 00:30:03.129 "driver_specific": {} 00:30:03.129 }' 00:30:03.129 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:03.129 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:03.129 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:03.129 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:03.129 11:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:30:03.129 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:03.129 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:03.129 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:30:03.388 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:03.647 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:03.647 "name": "BaseBdev3", 00:30:03.647 "aliases": [ 00:30:03.647 "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd" 00:30:03.647 ], 00:30:03.647 "product_name": "Malloc disk", 00:30:03.647 "block_size": 512, 00:30:03.647 "num_blocks": 65536, 00:30:03.647 "uuid": "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd", 00:30:03.647 "assigned_rate_limits": { 00:30:03.647 "rw_ios_per_sec": 0, 00:30:03.647 "rw_mbytes_per_sec": 0, 00:30:03.647 "r_mbytes_per_sec": 0, 00:30:03.647 "w_mbytes_per_sec": 0 00:30:03.647 }, 00:30:03.647 "claimed": true, 00:30:03.647 "claim_type": "exclusive_write", 00:30:03.647 "zoned": false, 00:30:03.647 "supported_io_types": { 00:30:03.647 "read": true, 00:30:03.647 "write": true, 00:30:03.647 "unmap": 
true, 00:30:03.647 "write_zeroes": true, 00:30:03.647 "flush": true, 00:30:03.647 "reset": true, 00:30:03.647 "compare": false, 00:30:03.647 "compare_and_write": false, 00:30:03.647 "abort": true, 00:30:03.647 "nvme_admin": false, 00:30:03.647 "nvme_io": false 00:30:03.647 }, 00:30:03.647 "memory_domains": [ 00:30:03.647 { 00:30:03.647 "dma_device_id": "system", 00:30:03.647 "dma_device_type": 1 00:30:03.647 }, 00:30:03.647 { 00:30:03.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:03.647 "dma_device_type": 2 00:30:03.647 } 00:30:03.647 ], 00:30:03.647 "driver_specific": {} 00:30:03.647 }' 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:03.648 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:03.907 [2024-06-10 11:40:47.814290] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:03.907 [2024-06-10 11:40:47.814311] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:03.907 [2024-06-10 11:40:47.814340] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:03.907 11:40:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.907 11:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:04.166 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:04.166 "name": "Existed_Raid", 00:30:04.166 "uuid": "f3b53935-7510-43ec-8d2f-75bfd8e2cabf", 00:30:04.166 "strip_size_kb": 64, 00:30:04.166 "state": "offline", 00:30:04.166 "raid_level": "raid0", 00:30:04.166 "superblock": true, 00:30:04.166 "num_base_bdevs": 3, 00:30:04.166 "num_base_bdevs_discovered": 2, 00:30:04.166 "num_base_bdevs_operational": 2, 00:30:04.166 "base_bdevs_list": [ 00:30:04.166 { 00:30:04.166 "name": null, 00:30:04.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:04.166 "is_configured": false, 00:30:04.166 "data_offset": 2048, 00:30:04.166 "data_size": 63488 00:30:04.166 }, 00:30:04.166 { 00:30:04.166 "name": "BaseBdev2", 00:30:04.166 "uuid": "ff874d28-3a69-4460-8d04-aaeb81dba6f4", 00:30:04.166 "is_configured": true, 00:30:04.166 "data_offset": 2048, 00:30:04.166 "data_size": 63488 00:30:04.166 }, 00:30:04.166 { 00:30:04.166 "name": "BaseBdev3", 00:30:04.166 "uuid": "f76a39b3-01a9-4adc-9c8d-b4090b2e0ddd", 00:30:04.166 "is_configured": true, 00:30:04.166 "data_offset": 2048, 00:30:04.166 "data_size": 63488 00:30:04.166 } 00:30:04.166 ] 00:30:04.166 }' 00:30:04.166 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:04.167 11:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:04.734 11:40:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:30:04.734 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:04.734 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.734 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:04.734 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:04.734 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:04.734 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:30:04.994 [2024-06-10 11:40:48.837704] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:04.994 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:04.994 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:04.994 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.994 11:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:05.253 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:05.253 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:05.253 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:30:05.512 [2024-06-10 11:40:49.206424] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:30:05.512 [2024-06-10 11:40:49.206461] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x152bcf0 name Existed_Raid, state offline 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:05.512 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:30:05.771 BaseBdev2 00:30:05.771 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:30:05.771 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:30:05.771 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:05.771 11:40:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:30:05.771 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:05.771 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:05.771 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:06.030 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:06.030 [ 00:30:06.030 { 00:30:06.030 "name": "BaseBdev2", 00:30:06.030 "aliases": [ 00:30:06.030 "769e7adc-2eec-4db2-837f-7fc7124bcb30" 00:30:06.030 ], 00:30:06.030 "product_name": "Malloc disk", 00:30:06.030 "block_size": 512, 00:30:06.030 "num_blocks": 65536, 00:30:06.030 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:06.030 "assigned_rate_limits": { 00:30:06.030 "rw_ios_per_sec": 0, 00:30:06.030 "rw_mbytes_per_sec": 0, 00:30:06.030 "r_mbytes_per_sec": 0, 00:30:06.030 "w_mbytes_per_sec": 0 00:30:06.030 }, 00:30:06.030 "claimed": false, 00:30:06.030 "zoned": false, 00:30:06.030 "supported_io_types": { 00:30:06.030 "read": true, 00:30:06.030 "write": true, 00:30:06.030 "unmap": true, 00:30:06.030 "write_zeroes": true, 00:30:06.030 "flush": true, 00:30:06.030 "reset": true, 00:30:06.030 "compare": false, 00:30:06.030 "compare_and_write": false, 00:30:06.030 "abort": true, 00:30:06.030 "nvme_admin": false, 00:30:06.030 "nvme_io": false 00:30:06.030 }, 00:30:06.030 "memory_domains": [ 00:30:06.030 { 00:30:06.030 "dma_device_id": "system", 00:30:06.030 "dma_device_type": 1 00:30:06.030 }, 00:30:06.030 { 00:30:06.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:06.030 "dma_device_type": 2 00:30:06.030 } 00:30:06.030 ], 
00:30:06.030 "driver_specific": {} 00:30:06.030 } 00:30:06.030 ] 00:30:06.030 11:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:30:06.030 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:06.030 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:06.030 11:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:30:06.289 BaseBdev3 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:06.289 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:06.548 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:30:06.548 [ 00:30:06.548 { 00:30:06.548 "name": "BaseBdev3", 00:30:06.548 "aliases": [ 00:30:06.548 "5188bf4f-d8ca-4ce2-83c5-5f755c10a045" 00:30:06.548 ], 00:30:06.548 "product_name": "Malloc disk", 00:30:06.548 "block_size": 512, 00:30:06.548 
"num_blocks": 65536, 00:30:06.548 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:06.548 "assigned_rate_limits": { 00:30:06.548 "rw_ios_per_sec": 0, 00:30:06.548 "rw_mbytes_per_sec": 0, 00:30:06.548 "r_mbytes_per_sec": 0, 00:30:06.548 "w_mbytes_per_sec": 0 00:30:06.548 }, 00:30:06.548 "claimed": false, 00:30:06.548 "zoned": false, 00:30:06.548 "supported_io_types": { 00:30:06.548 "read": true, 00:30:06.548 "write": true, 00:30:06.548 "unmap": true, 00:30:06.548 "write_zeroes": true, 00:30:06.548 "flush": true, 00:30:06.548 "reset": true, 00:30:06.548 "compare": false, 00:30:06.548 "compare_and_write": false, 00:30:06.548 "abort": true, 00:30:06.548 "nvme_admin": false, 00:30:06.548 "nvme_io": false 00:30:06.548 }, 00:30:06.548 "memory_domains": [ 00:30:06.548 { 00:30:06.548 "dma_device_id": "system", 00:30:06.548 "dma_device_type": 1 00:30:06.548 }, 00:30:06.548 { 00:30:06.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:06.548 "dma_device_type": 2 00:30:06.548 } 00:30:06.548 ], 00:30:06.548 "driver_specific": {} 00:30:06.548 } 00:30:06.548 ] 00:30:06.548 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:30:06.548 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:06.548 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:06.548 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:30:06.807 [2024-06-10 11:40:50.581191] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:06.807 [2024-06-10 11:40:50.581231] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:06.807 [2024-06-10 11:40:50.581245] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:06.807 [2024-06-10 11:40:50.582271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:06.807 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:07.066 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:07.066 "name": "Existed_Raid", 00:30:07.066 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:07.066 "strip_size_kb": 64, 00:30:07.066 "state": 
"configuring", 00:30:07.066 "raid_level": "raid0", 00:30:07.066 "superblock": true, 00:30:07.066 "num_base_bdevs": 3, 00:30:07.066 "num_base_bdevs_discovered": 2, 00:30:07.066 "num_base_bdevs_operational": 3, 00:30:07.066 "base_bdevs_list": [ 00:30:07.066 { 00:30:07.066 "name": "BaseBdev1", 00:30:07.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:07.066 "is_configured": false, 00:30:07.066 "data_offset": 0, 00:30:07.066 "data_size": 0 00:30:07.066 }, 00:30:07.066 { 00:30:07.066 "name": "BaseBdev2", 00:30:07.066 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:07.066 "is_configured": true, 00:30:07.066 "data_offset": 2048, 00:30:07.066 "data_size": 63488 00:30:07.066 }, 00:30:07.066 { 00:30:07.066 "name": "BaseBdev3", 00:30:07.066 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:07.066 "is_configured": true, 00:30:07.066 "data_offset": 2048, 00:30:07.066 "data_size": 63488 00:30:07.066 } 00:30:07.066 ] 00:30:07.066 }' 00:30:07.066 11:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:07.066 11:40:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:07.325 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:30:07.584 [2024-06-10 11:40:51.411292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:07.584 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:07.843 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:07.843 "name": "Existed_Raid", 00:30:07.843 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:07.843 "strip_size_kb": 64, 00:30:07.843 "state": "configuring", 00:30:07.843 "raid_level": "raid0", 00:30:07.843 "superblock": true, 00:30:07.843 "num_base_bdevs": 3, 00:30:07.843 "num_base_bdevs_discovered": 1, 00:30:07.843 "num_base_bdevs_operational": 3, 00:30:07.843 "base_bdevs_list": [ 00:30:07.843 { 00:30:07.843 "name": "BaseBdev1", 00:30:07.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:07.843 "is_configured": false, 00:30:07.843 "data_offset": 0, 00:30:07.843 "data_size": 0 00:30:07.843 }, 00:30:07.843 { 00:30:07.843 "name": null, 00:30:07.843 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:07.843 "is_configured": false, 00:30:07.843 "data_offset": 2048, 00:30:07.843 "data_size": 63488 00:30:07.843 }, 00:30:07.843 { 00:30:07.843 
"name": "BaseBdev3", 00:30:07.843 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:07.843 "is_configured": true, 00:30:07.843 "data_offset": 2048, 00:30:07.843 "data_size": 63488 00:30:07.843 } 00:30:07.843 ] 00:30:07.843 }' 00:30:07.843 11:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:07.843 11:40:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:08.411 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.411 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:30:08.411 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:30:08.411 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:30:08.669 [2024-06-10 11:40:52.460886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:08.669 BaseBdev1 00:30:08.669 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:30:08.669 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:30:08.669 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:08.669 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:30:08.669 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:08.669 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:08.669 11:40:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:08.928 [ 00:30:08.928 { 00:30:08.928 "name": "BaseBdev1", 00:30:08.928 "aliases": [ 00:30:08.928 "c5709cf4-9528-4640-9737-50f7aa65af8a" 00:30:08.928 ], 00:30:08.928 "product_name": "Malloc disk", 00:30:08.928 "block_size": 512, 00:30:08.928 "num_blocks": 65536, 00:30:08.928 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:08.928 "assigned_rate_limits": { 00:30:08.928 "rw_ios_per_sec": 0, 00:30:08.928 "rw_mbytes_per_sec": 0, 00:30:08.928 "r_mbytes_per_sec": 0, 00:30:08.928 "w_mbytes_per_sec": 0 00:30:08.928 }, 00:30:08.928 "claimed": true, 00:30:08.928 "claim_type": "exclusive_write", 00:30:08.928 "zoned": false, 00:30:08.928 "supported_io_types": { 00:30:08.928 "read": true, 00:30:08.928 "write": true, 00:30:08.928 "unmap": true, 00:30:08.928 "write_zeroes": true, 00:30:08.928 "flush": true, 00:30:08.928 "reset": true, 00:30:08.928 "compare": false, 00:30:08.928 "compare_and_write": false, 00:30:08.928 "abort": true, 00:30:08.928 "nvme_admin": false, 00:30:08.928 "nvme_io": false 00:30:08.928 }, 00:30:08.928 "memory_domains": [ 00:30:08.928 { 00:30:08.928 "dma_device_id": "system", 00:30:08.928 "dma_device_type": 1 00:30:08.928 }, 00:30:08.928 { 00:30:08.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:08.928 "dma_device_type": 2 00:30:08.928 } 00:30:08.928 ], 00:30:08.928 "driver_specific": {} 00:30:08.928 } 00:30:08.928 ] 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 3 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.928 11:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:09.187 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:09.187 "name": "Existed_Raid", 00:30:09.187 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:09.187 "strip_size_kb": 64, 00:30:09.187 "state": "configuring", 00:30:09.187 "raid_level": "raid0", 00:30:09.187 "superblock": true, 00:30:09.187 "num_base_bdevs": 3, 00:30:09.187 "num_base_bdevs_discovered": 2, 00:30:09.187 "num_base_bdevs_operational": 3, 00:30:09.187 "base_bdevs_list": [ 00:30:09.187 { 00:30:09.187 "name": "BaseBdev1", 00:30:09.187 "uuid": 
"c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:09.187 "is_configured": true, 00:30:09.187 "data_offset": 2048, 00:30:09.187 "data_size": 63488 00:30:09.187 }, 00:30:09.187 { 00:30:09.187 "name": null, 00:30:09.187 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:09.187 "is_configured": false, 00:30:09.187 "data_offset": 2048, 00:30:09.187 "data_size": 63488 00:30:09.187 }, 00:30:09.187 { 00:30:09.187 "name": "BaseBdev3", 00:30:09.187 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:09.187 "is_configured": true, 00:30:09.187 "data_offset": 2048, 00:30:09.187 "data_size": 63488 00:30:09.187 } 00:30:09.187 ] 00:30:09.187 }' 00:30:09.187 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:09.187 11:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:09.754 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.755 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:30:09.755 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:30:09.755 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:30:10.013 [2024-06-10 11:40:53.836618] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.013 11:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:10.272 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:10.272 "name": "Existed_Raid", 00:30:10.272 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:10.272 "strip_size_kb": 64, 00:30:10.272 "state": "configuring", 00:30:10.272 "raid_level": "raid0", 00:30:10.272 "superblock": true, 00:30:10.272 "num_base_bdevs": 3, 00:30:10.272 "num_base_bdevs_discovered": 1, 00:30:10.272 "num_base_bdevs_operational": 3, 00:30:10.272 "base_bdevs_list": [ 00:30:10.272 { 00:30:10.272 "name": "BaseBdev1", 00:30:10.272 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:10.272 "is_configured": true, 00:30:10.272 "data_offset": 2048, 00:30:10.272 "data_size": 63488 00:30:10.272 }, 00:30:10.272 { 00:30:10.272 "name": null, 00:30:10.272 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 
00:30:10.272 "is_configured": false, 00:30:10.272 "data_offset": 2048, 00:30:10.272 "data_size": 63488 00:30:10.272 }, 00:30:10.272 { 00:30:10.272 "name": null, 00:30:10.272 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:10.272 "is_configured": false, 00:30:10.272 "data_offset": 2048, 00:30:10.272 "data_size": 63488 00:30:10.272 } 00:30:10.272 ] 00:30:10.272 }' 00:30:10.272 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:10.272 11:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:10.838 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:30:10.838 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.839 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:30:10.839 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:30:11.097 [2024-06-10 11:40:54.867307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.097 11:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:11.356 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:11.356 "name": "Existed_Raid", 00:30:11.356 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:11.356 "strip_size_kb": 64, 00:30:11.356 "state": "configuring", 00:30:11.356 "raid_level": "raid0", 00:30:11.356 "superblock": true, 00:30:11.356 "num_base_bdevs": 3, 00:30:11.356 "num_base_bdevs_discovered": 2, 00:30:11.356 "num_base_bdevs_operational": 3, 00:30:11.356 "base_bdevs_list": [ 00:30:11.356 { 00:30:11.356 "name": "BaseBdev1", 00:30:11.356 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:11.356 "is_configured": true, 00:30:11.356 "data_offset": 2048, 00:30:11.356 "data_size": 63488 00:30:11.356 }, 00:30:11.356 { 00:30:11.356 "name": null, 00:30:11.356 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:11.356 "is_configured": false, 00:30:11.356 "data_offset": 2048, 00:30:11.356 "data_size": 63488 00:30:11.356 }, 00:30:11.356 { 00:30:11.356 "name": "BaseBdev3", 00:30:11.356 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 
00:30:11.356 "is_configured": true, 00:30:11.356 "data_offset": 2048, 00:30:11.356 "data_size": 63488 00:30:11.356 } 00:30:11.356 ] 00:30:11.356 }' 00:30:11.357 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:11.357 11:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:11.615 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.615 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:30:11.874 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:30:11.874 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:12.133 [2024-06-10 11:40:55.893964] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:12.133 11:40:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.133 11:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:12.392 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.392 "name": "Existed_Raid", 00:30:12.392 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:12.392 "strip_size_kb": 64, 00:30:12.392 "state": "configuring", 00:30:12.392 "raid_level": "raid0", 00:30:12.392 "superblock": true, 00:30:12.392 "num_base_bdevs": 3, 00:30:12.392 "num_base_bdevs_discovered": 1, 00:30:12.392 "num_base_bdevs_operational": 3, 00:30:12.392 "base_bdevs_list": [ 00:30:12.392 { 00:30:12.392 "name": null, 00:30:12.392 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:12.392 "is_configured": false, 00:30:12.392 "data_offset": 2048, 00:30:12.392 "data_size": 63488 00:30:12.392 }, 00:30:12.392 { 00:30:12.392 "name": null, 00:30:12.392 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:12.392 "is_configured": false, 00:30:12.392 "data_offset": 2048, 00:30:12.392 "data_size": 63488 00:30:12.392 }, 00:30:12.392 { 00:30:12.392 "name": "BaseBdev3", 00:30:12.392 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:12.392 "is_configured": true, 00:30:12.392 "data_offset": 2048, 00:30:12.392 "data_size": 63488 00:30:12.392 } 00:30:12.392 ] 00:30:12.392 }' 00:30:12.392 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.392 11:40:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:12.650 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:30:12.650 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.908 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:30:12.908 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:30:13.167 [2024-06-10 11:40:56.900404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:13.167 11:40:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.167 11:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:13.167 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:13.167 "name": "Existed_Raid", 00:30:13.167 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:13.167 "strip_size_kb": 64, 00:30:13.167 "state": "configuring", 00:30:13.167 "raid_level": "raid0", 00:30:13.167 "superblock": true, 00:30:13.167 "num_base_bdevs": 3, 00:30:13.167 "num_base_bdevs_discovered": 2, 00:30:13.167 "num_base_bdevs_operational": 3, 00:30:13.167 "base_bdevs_list": [ 00:30:13.167 { 00:30:13.167 "name": null, 00:30:13.167 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:13.167 "is_configured": false, 00:30:13.167 "data_offset": 2048, 00:30:13.167 "data_size": 63488 00:30:13.167 }, 00:30:13.167 { 00:30:13.167 "name": "BaseBdev2", 00:30:13.167 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:13.167 "is_configured": true, 00:30:13.167 "data_offset": 2048, 00:30:13.167 "data_size": 63488 00:30:13.167 }, 00:30:13.167 { 00:30:13.167 "name": "BaseBdev3", 00:30:13.167 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:13.167 "is_configured": true, 00:30:13.167 "data_offset": 2048, 00:30:13.167 "data_size": 63488 00:30:13.167 } 00:30:13.167 ] 00:30:13.167 }' 00:30:13.167 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:13.167 11:40:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:13.735 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:30:13.735 11:40:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.994 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:30:13.994 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.994 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:30:14.253 11:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c5709cf4-9528-4640-9737-50f7aa65af8a 00:30:14.254 [2024-06-10 11:40:58.111538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:30:14.254 [2024-06-10 11:40:58.111665] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1522a60 00:30:14.254 [2024-06-10 11:40:58.111674] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:30:14.254 [2024-06-10 11:40:58.111797] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16df140 00:30:14.254 [2024-06-10 11:40:58.111901] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1522a60 00:30:14.254 [2024-06-10 11:40:58.111908] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1522a60 00:30:14.254 [2024-06-10 11:40:58.111977] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:14.254 NewBaseBdev 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:14.254 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:14.513 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:30:14.513 [ 00:30:14.513 { 00:30:14.513 "name": "NewBaseBdev", 00:30:14.513 "aliases": [ 00:30:14.513 "c5709cf4-9528-4640-9737-50f7aa65af8a" 00:30:14.513 ], 00:30:14.513 "product_name": "Malloc disk", 00:30:14.513 "block_size": 512, 00:30:14.513 "num_blocks": 65536, 00:30:14.513 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:14.513 "assigned_rate_limits": { 00:30:14.513 "rw_ios_per_sec": 0, 00:30:14.513 "rw_mbytes_per_sec": 0, 00:30:14.513 "r_mbytes_per_sec": 0, 00:30:14.513 "w_mbytes_per_sec": 0 00:30:14.513 }, 00:30:14.513 "claimed": true, 00:30:14.513 "claim_type": "exclusive_write", 00:30:14.513 "zoned": false, 00:30:14.513 "supported_io_types": { 00:30:14.513 "read": true, 00:30:14.514 "write": true, 00:30:14.514 "unmap": true, 00:30:14.514 "write_zeroes": true, 00:30:14.514 "flush": true, 00:30:14.514 "reset": true, 00:30:14.514 "compare": false, 00:30:14.514 "compare_and_write": false, 00:30:14.514 "abort": true, 00:30:14.514 "nvme_admin": false, 00:30:14.514 "nvme_io": false 00:30:14.514 }, 00:30:14.514 
"memory_domains": [ 00:30:14.514 { 00:30:14.514 "dma_device_id": "system", 00:30:14.514 "dma_device_type": 1 00:30:14.514 }, 00:30:14.514 { 00:30:14.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:14.514 "dma_device_type": 2 00:30:14.514 } 00:30:14.514 ], 00:30:14.514 "driver_specific": {} 00:30:14.514 } 00:30:14.514 ] 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:14.773 11:40:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:14.773 "name": "Existed_Raid", 00:30:14.773 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:14.773 "strip_size_kb": 64, 00:30:14.773 "state": "online", 00:30:14.773 "raid_level": "raid0", 00:30:14.773 "superblock": true, 00:30:14.773 "num_base_bdevs": 3, 00:30:14.773 "num_base_bdevs_discovered": 3, 00:30:14.773 "num_base_bdevs_operational": 3, 00:30:14.773 "base_bdevs_list": [ 00:30:14.773 { 00:30:14.773 "name": "NewBaseBdev", 00:30:14.773 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:14.773 "is_configured": true, 00:30:14.773 "data_offset": 2048, 00:30:14.773 "data_size": 63488 00:30:14.773 }, 00:30:14.773 { 00:30:14.773 "name": "BaseBdev2", 00:30:14.773 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:14.773 "is_configured": true, 00:30:14.773 "data_offset": 2048, 00:30:14.773 "data_size": 63488 00:30:14.773 }, 00:30:14.773 { 00:30:14.773 "name": "BaseBdev3", 00:30:14.773 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:14.773 "is_configured": true, 00:30:14.773 "data_offset": 2048, 00:30:14.773 "data_size": 63488 00:30:14.773 } 00:30:14.773 ] 00:30:14.773 }' 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:14.773 11:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:15.342 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:15.342 [2024-06-10 11:40:59.278739] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:15.601 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:15.601 "name": "Existed_Raid", 00:30:15.601 "aliases": [ 00:30:15.601 "d051687c-fa9e-483d-949a-614d9455d548" 00:30:15.601 ], 00:30:15.601 "product_name": "Raid Volume", 00:30:15.601 "block_size": 512, 00:30:15.601 "num_blocks": 190464, 00:30:15.601 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:15.601 "assigned_rate_limits": { 00:30:15.601 "rw_ios_per_sec": 0, 00:30:15.601 "rw_mbytes_per_sec": 0, 00:30:15.601 "r_mbytes_per_sec": 0, 00:30:15.601 "w_mbytes_per_sec": 0 00:30:15.601 }, 00:30:15.601 "claimed": false, 00:30:15.601 "zoned": false, 00:30:15.601 "supported_io_types": { 00:30:15.601 "read": true, 00:30:15.601 "write": true, 00:30:15.601 "unmap": true, 00:30:15.601 "write_zeroes": true, 00:30:15.601 "flush": true, 00:30:15.601 "reset": true, 00:30:15.601 "compare": false, 00:30:15.601 "compare_and_write": false, 00:30:15.601 "abort": false, 00:30:15.601 "nvme_admin": false, 00:30:15.601 "nvme_io": false 00:30:15.601 }, 00:30:15.601 "memory_domains": [ 00:30:15.601 { 00:30:15.601 "dma_device_id": "system", 00:30:15.601 "dma_device_type": 1 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:15.601 "dma_device_type": 2 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "dma_device_id": "system", 00:30:15.601 "dma_device_type": 1 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:30:15.601 "dma_device_type": 2 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "dma_device_id": "system", 00:30:15.601 "dma_device_type": 1 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:15.601 "dma_device_type": 2 00:30:15.601 } 00:30:15.601 ], 00:30:15.601 "driver_specific": { 00:30:15.601 "raid": { 00:30:15.601 "uuid": "d051687c-fa9e-483d-949a-614d9455d548", 00:30:15.601 "strip_size_kb": 64, 00:30:15.601 "state": "online", 00:30:15.601 "raid_level": "raid0", 00:30:15.601 "superblock": true, 00:30:15.601 "num_base_bdevs": 3, 00:30:15.601 "num_base_bdevs_discovered": 3, 00:30:15.601 "num_base_bdevs_operational": 3, 00:30:15.601 "base_bdevs_list": [ 00:30:15.601 { 00:30:15.601 "name": "NewBaseBdev", 00:30:15.601 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:15.601 "is_configured": true, 00:30:15.601 "data_offset": 2048, 00:30:15.601 "data_size": 63488 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "name": "BaseBdev2", 00:30:15.601 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:15.601 "is_configured": true, 00:30:15.601 "data_offset": 2048, 00:30:15.601 "data_size": 63488 00:30:15.601 }, 00:30:15.601 { 00:30:15.601 "name": "BaseBdev3", 00:30:15.601 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:15.601 "is_configured": true, 00:30:15.602 "data_offset": 2048, 00:30:15.602 "data_size": 63488 00:30:15.602 } 00:30:15.602 ] 00:30:15.602 } 00:30:15.602 } 00:30:15.602 }' 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:30:15.602 BaseBdev2 00:30:15.602 BaseBdev3' 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:15.602 "name": "NewBaseBdev", 00:30:15.602 "aliases": [ 00:30:15.602 "c5709cf4-9528-4640-9737-50f7aa65af8a" 00:30:15.602 ], 00:30:15.602 "product_name": "Malloc disk", 00:30:15.602 "block_size": 512, 00:30:15.602 "num_blocks": 65536, 00:30:15.602 "uuid": "c5709cf4-9528-4640-9737-50f7aa65af8a", 00:30:15.602 "assigned_rate_limits": { 00:30:15.602 "rw_ios_per_sec": 0, 00:30:15.602 "rw_mbytes_per_sec": 0, 00:30:15.602 "r_mbytes_per_sec": 0, 00:30:15.602 "w_mbytes_per_sec": 0 00:30:15.602 }, 00:30:15.602 "claimed": true, 00:30:15.602 "claim_type": "exclusive_write", 00:30:15.602 "zoned": false, 00:30:15.602 "supported_io_types": { 00:30:15.602 "read": true, 00:30:15.602 "write": true, 00:30:15.602 "unmap": true, 00:30:15.602 "write_zeroes": true, 00:30:15.602 "flush": true, 00:30:15.602 "reset": true, 00:30:15.602 "compare": false, 00:30:15.602 "compare_and_write": false, 00:30:15.602 "abort": true, 00:30:15.602 "nvme_admin": false, 00:30:15.602 "nvme_io": false 00:30:15.602 }, 00:30:15.602 "memory_domains": [ 00:30:15.602 { 00:30:15.602 "dma_device_id": "system", 00:30:15.602 "dma_device_type": 1 00:30:15.602 }, 00:30:15.602 { 00:30:15.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:15.602 "dma_device_type": 2 00:30:15.602 } 00:30:15.602 ], 00:30:15.602 "driver_specific": {} 00:30:15.602 }' 00:30:15.602 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 
00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:15.861 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.121 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:16.121 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:16.121 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:16.121 11:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:16.121 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:16.121 "name": "BaseBdev2", 00:30:16.121 "aliases": [ 00:30:16.121 "769e7adc-2eec-4db2-837f-7fc7124bcb30" 00:30:16.121 ], 00:30:16.121 "product_name": "Malloc disk", 00:30:16.121 "block_size": 512, 00:30:16.121 "num_blocks": 65536, 00:30:16.121 "uuid": "769e7adc-2eec-4db2-837f-7fc7124bcb30", 00:30:16.121 "assigned_rate_limits": { 00:30:16.121 "rw_ios_per_sec": 0, 00:30:16.121 "rw_mbytes_per_sec": 0, 00:30:16.121 "r_mbytes_per_sec": 0, 00:30:16.121 "w_mbytes_per_sec": 0 00:30:16.121 }, 00:30:16.121 
"claimed": true, 00:30:16.121 "claim_type": "exclusive_write", 00:30:16.121 "zoned": false, 00:30:16.121 "supported_io_types": { 00:30:16.121 "read": true, 00:30:16.121 "write": true, 00:30:16.121 "unmap": true, 00:30:16.121 "write_zeroes": true, 00:30:16.121 "flush": true, 00:30:16.121 "reset": true, 00:30:16.121 "compare": false, 00:30:16.121 "compare_and_write": false, 00:30:16.121 "abort": true, 00:30:16.121 "nvme_admin": false, 00:30:16.121 "nvme_io": false 00:30:16.121 }, 00:30:16.121 "memory_domains": [ 00:30:16.121 { 00:30:16.121 "dma_device_id": "system", 00:30:16.121 "dma_device_type": 1 00:30:16.121 }, 00:30:16.121 { 00:30:16.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:16.121 "dma_device_type": 2 00:30:16.121 } 00:30:16.121 ], 00:30:16.121 "driver_specific": {} 00:30:16.121 }' 00:30:16.121 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.121 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.381 11:41:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:30:16.381 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:16.651 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:16.651 "name": "BaseBdev3", 00:30:16.651 "aliases": [ 00:30:16.651 "5188bf4f-d8ca-4ce2-83c5-5f755c10a045" 00:30:16.651 ], 00:30:16.651 "product_name": "Malloc disk", 00:30:16.651 "block_size": 512, 00:30:16.651 "num_blocks": 65536, 00:30:16.651 "uuid": "5188bf4f-d8ca-4ce2-83c5-5f755c10a045", 00:30:16.651 "assigned_rate_limits": { 00:30:16.651 "rw_ios_per_sec": 0, 00:30:16.651 "rw_mbytes_per_sec": 0, 00:30:16.651 "r_mbytes_per_sec": 0, 00:30:16.651 "w_mbytes_per_sec": 0 00:30:16.651 }, 00:30:16.651 "claimed": true, 00:30:16.651 "claim_type": "exclusive_write", 00:30:16.651 "zoned": false, 00:30:16.651 "supported_io_types": { 00:30:16.651 "read": true, 00:30:16.651 "write": true, 00:30:16.651 "unmap": true, 00:30:16.651 "write_zeroes": true, 00:30:16.651 "flush": true, 00:30:16.652 "reset": true, 00:30:16.652 "compare": false, 00:30:16.652 "compare_and_write": false, 00:30:16.652 "abort": true, 00:30:16.652 "nvme_admin": false, 00:30:16.652 "nvme_io": false 00:30:16.652 }, 00:30:16.652 "memory_domains": [ 00:30:16.652 { 00:30:16.652 "dma_device_id": "system", 00:30:16.652 "dma_device_type": 1 00:30:16.652 }, 00:30:16.652 { 00:30:16.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:16.652 "dma_device_type": 2 00:30:16.652 } 00:30:16.652 ], 00:30:16.652 "driver_specific": {} 00:30:16.652 }' 00:30:16.652 11:41:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.652 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.652 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:16.652 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:16.973 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:17.232 [2024-06-10 11:41:00.930847] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:17.232 [2024-06-10 11:41:00.930874] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:17.232 [2024-06-10 11:41:00.930918] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:17.232 [2024-06-10 11:41:00.930956] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:17.232 [2024-06-10 11:41:00.930964] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1522a60 name Existed_Raid, state offline 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 150686 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 150686 ']' 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 150686 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 150686 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 150686' 00:30:17.232 killing process with pid 150686 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 150686 00:30:17.232 [2024-06-10 11:41:00.998458] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:17.232 11:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 150686 00:30:17.232 [2024-06-10 11:41:01.024080] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:17.492 11:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:30:17.492 00:30:17.492 real 0m21.843s 00:30:17.492 user 0m39.856s 00:30:17.492 sys 0m4.185s 00:30:17.492 11:41:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:17.492 11:41:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:17.492 ************************************ 00:30:17.492 END TEST raid_state_function_test_sb 00:30:17.492 ************************************ 00:30:17.492 11:41:01 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:30:17.492 11:41:01 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:30:17.492 11:41:01 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:17.492 11:41:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:17.492 ************************************ 00:30:17.492 START TEST raid_superblock_test 00:30:17.492 ************************************ 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 3 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=154243 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 154243 /var/tmp/spdk-raid.sock 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 154243 ']' 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:17.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:17.492 11:41:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:17.492 [2024-06-10 11:41:01.364368] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:30:17.492 [2024-06-10 11:41:01.364419] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154243 ] 00:30:17.751 [2024-06-10 11:41:01.452538] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:17.751 [2024-06-10 11:41:01.540193] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:17.751 [2024-06-10 11:41:01.597954] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:17.751 [2024-06-10 11:41:01.597983] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:18.323 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:30:18.581 malloc1 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:18.581 [2024-06-10 11:41:02.473984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:18.581 [2024-06-10 11:41:02.474021] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:18.581 [2024-06-10 11:41:02.474035] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2373100 00:30:18.581 [2024-06-10 11:41:02.474059] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:18.581 [2024-06-10 11:41:02.475306] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:18.581 [2024-06-10 11:41:02.475329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:18.581 pt1 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:18.581 11:41:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:30:18.840 malloc2 00:30:18.840 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:19.099 [2024-06-10 11:41:02.816005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:19.099 [2024-06-10 11:41:02.816041] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:19.099 [2024-06-10 11:41:02.816072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2374500 00:30:19.099 [2024-06-10 11:41:02.816080] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:19.099 [2024-06-10 11:41:02.817272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:19.099 [2024-06-10 11:41:02.817294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:19.099 pt2 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:19.099 11:41:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:30:19.099 malloc3 00:30:19.099 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:30:19.359 [2024-06-10 11:41:03.161423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:30:19.359 [2024-06-10 11:41:03.161458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:19.359 [2024-06-10 11:41:03.161471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251e7a0 00:30:19.359 [2024-06-10 11:41:03.161480] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:19.359 [2024-06-10 11:41:03.162602] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:19.359 [2024-06-10 11:41:03.162624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:30:19.359 pt3 00:30:19.359 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:19.359 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:19.359 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:30:19.618 [2024-06-10 11:41:03.333895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:19.618 [2024-06-10 11:41:03.334893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:19.618 [2024-06-10 
11:41:03.334934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:30:19.618 [2024-06-10 11:41:03.335047] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2521d40 00:30:19.618 [2024-06-10 11:41:03.335055] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:30:19.618 [2024-06-10 11:41:03.335197] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2373e90 00:30:19.618 [2024-06-10 11:41:03.335295] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2521d40 00:30:19.618 [2024-06-10 11:41:03.335302] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2521d40 00:30:19.618 [2024-06-10 11:41:03.335369] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:19.618 11:41:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:19.618 "name": "raid_bdev1", 00:30:19.618 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:19.618 "strip_size_kb": 64, 00:30:19.618 "state": "online", 00:30:19.618 "raid_level": "raid0", 00:30:19.618 "superblock": true, 00:30:19.618 "num_base_bdevs": 3, 00:30:19.618 "num_base_bdevs_discovered": 3, 00:30:19.618 "num_base_bdevs_operational": 3, 00:30:19.618 "base_bdevs_list": [ 00:30:19.618 { 00:30:19.618 "name": "pt1", 00:30:19.618 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:19.618 "is_configured": true, 00:30:19.618 "data_offset": 2048, 00:30:19.618 "data_size": 63488 00:30:19.618 }, 00:30:19.618 { 00:30:19.618 "name": "pt2", 00:30:19.618 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:19.618 "is_configured": true, 00:30:19.618 "data_offset": 2048, 00:30:19.618 "data_size": 63488 00:30:19.618 }, 00:30:19.618 { 00:30:19.618 "name": "pt3", 00:30:19.618 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:19.618 "is_configured": true, 00:30:19.618 "data_offset": 2048, 00:30:19.618 "data_size": 63488 00:30:19.618 } 00:30:19.618 ] 00:30:19.618 }' 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:19.618 11:41:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:20.186 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:20.446 [2024-06-10 11:41:04.164172] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:20.446 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:20.446 "name": "raid_bdev1", 00:30:20.446 "aliases": [ 00:30:20.446 "6e065df4-6466-4f0f-b151-358cd9e02a60" 00:30:20.446 ], 00:30:20.446 "product_name": "Raid Volume", 00:30:20.446 "block_size": 512, 00:30:20.446 "num_blocks": 190464, 00:30:20.446 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:20.446 "assigned_rate_limits": { 00:30:20.446 "rw_ios_per_sec": 0, 00:30:20.446 "rw_mbytes_per_sec": 0, 00:30:20.446 "r_mbytes_per_sec": 0, 00:30:20.446 "w_mbytes_per_sec": 0 00:30:20.446 }, 00:30:20.446 "claimed": false, 00:30:20.446 "zoned": false, 00:30:20.446 "supported_io_types": { 00:30:20.446 "read": true, 00:30:20.446 "write": true, 00:30:20.446 "unmap": true, 00:30:20.446 "write_zeroes": true, 00:30:20.446 "flush": true, 00:30:20.446 "reset": true, 00:30:20.446 "compare": false, 00:30:20.446 "compare_and_write": false, 00:30:20.446 "abort": false, 00:30:20.446 "nvme_admin": false, 00:30:20.446 "nvme_io": false 00:30:20.446 }, 00:30:20.446 "memory_domains": [ 00:30:20.446 { 00:30:20.446 "dma_device_id": "system", 00:30:20.446 "dma_device_type": 1 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:30:20.446 "dma_device_type": 2 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "dma_device_id": "system", 00:30:20.446 "dma_device_type": 1 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:20.446 "dma_device_type": 2 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "dma_device_id": "system", 00:30:20.446 "dma_device_type": 1 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:20.446 "dma_device_type": 2 00:30:20.446 } 00:30:20.446 ], 00:30:20.446 "driver_specific": { 00:30:20.446 "raid": { 00:30:20.446 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:20.446 "strip_size_kb": 64, 00:30:20.446 "state": "online", 00:30:20.446 "raid_level": "raid0", 00:30:20.446 "superblock": true, 00:30:20.446 "num_base_bdevs": 3, 00:30:20.446 "num_base_bdevs_discovered": 3, 00:30:20.446 "num_base_bdevs_operational": 3, 00:30:20.446 "base_bdevs_list": [ 00:30:20.446 { 00:30:20.446 "name": "pt1", 00:30:20.446 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:20.446 "is_configured": true, 00:30:20.446 "data_offset": 2048, 00:30:20.446 "data_size": 63488 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "name": "pt2", 00:30:20.446 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:20.446 "is_configured": true, 00:30:20.446 "data_offset": 2048, 00:30:20.446 "data_size": 63488 00:30:20.446 }, 00:30:20.446 { 00:30:20.446 "name": "pt3", 00:30:20.446 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:20.446 "is_configured": true, 00:30:20.446 "data_offset": 2048, 00:30:20.446 "data_size": 63488 00:30:20.446 } 00:30:20.446 ] 00:30:20.446 } 00:30:20.446 } 00:30:20.446 }' 00:30:20.446 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:20.446 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:20.446 pt2 00:30:20.446 pt3' 00:30:20.446 
11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:20.446 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:20.446 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:20.706 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:20.706 "name": "pt1", 00:30:20.706 "aliases": [ 00:30:20.706 "00000000-0000-0000-0000-000000000001" 00:30:20.706 ], 00:30:20.706 "product_name": "passthru", 00:30:20.706 "block_size": 512, 00:30:20.706 "num_blocks": 65536, 00:30:20.706 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:20.706 "assigned_rate_limits": { 00:30:20.706 "rw_ios_per_sec": 0, 00:30:20.706 "rw_mbytes_per_sec": 0, 00:30:20.706 "r_mbytes_per_sec": 0, 00:30:20.706 "w_mbytes_per_sec": 0 00:30:20.706 }, 00:30:20.706 "claimed": true, 00:30:20.706 "claim_type": "exclusive_write", 00:30:20.706 "zoned": false, 00:30:20.706 "supported_io_types": { 00:30:20.706 "read": true, 00:30:20.706 "write": true, 00:30:20.706 "unmap": true, 00:30:20.706 "write_zeroes": true, 00:30:20.706 "flush": true, 00:30:20.706 "reset": true, 00:30:20.706 "compare": false, 00:30:20.706 "compare_and_write": false, 00:30:20.706 "abort": true, 00:30:20.706 "nvme_admin": false, 00:30:20.706 "nvme_io": false 00:30:20.706 }, 00:30:20.706 "memory_domains": [ 00:30:20.706 { 00:30:20.706 "dma_device_id": "system", 00:30:20.706 "dma_device_type": 1 00:30:20.706 }, 00:30:20.706 { 00:30:20.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:20.706 "dma_device_type": 2 00:30:20.706 } 00:30:20.706 ], 00:30:20.706 "driver_specific": { 00:30:20.706 "passthru": { 00:30:20.706 "name": "pt1", 00:30:20.706 "base_bdev_name": "malloc1" 00:30:20.706 } 00:30:20.706 } 00:30:20.706 }' 00:30:20.706 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:30:20.706 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:20.706 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:20.706 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:20.706 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:20.707 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:20.707 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:20.707 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:20.966 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:20.966 "name": "pt2", 00:30:20.966 "aliases": [ 00:30:20.966 "00000000-0000-0000-0000-000000000002" 00:30:20.966 ], 00:30:20.966 "product_name": "passthru", 00:30:20.966 "block_size": 512, 00:30:20.966 "num_blocks": 65536, 00:30:20.966 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:20.966 "assigned_rate_limits": { 00:30:20.966 "rw_ios_per_sec": 0, 00:30:20.966 
"rw_mbytes_per_sec": 0, 00:30:20.966 "r_mbytes_per_sec": 0, 00:30:20.966 "w_mbytes_per_sec": 0 00:30:20.966 }, 00:30:20.966 "claimed": true, 00:30:20.966 "claim_type": "exclusive_write", 00:30:20.966 "zoned": false, 00:30:20.966 "supported_io_types": { 00:30:20.966 "read": true, 00:30:20.966 "write": true, 00:30:20.966 "unmap": true, 00:30:20.966 "write_zeroes": true, 00:30:20.966 "flush": true, 00:30:20.966 "reset": true, 00:30:20.967 "compare": false, 00:30:20.967 "compare_and_write": false, 00:30:20.967 "abort": true, 00:30:20.967 "nvme_admin": false, 00:30:20.967 "nvme_io": false 00:30:20.967 }, 00:30:20.967 "memory_domains": [ 00:30:20.967 { 00:30:20.967 "dma_device_id": "system", 00:30:20.967 "dma_device_type": 1 00:30:20.967 }, 00:30:20.967 { 00:30:20.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:20.967 "dma_device_type": 2 00:30:20.967 } 00:30:20.967 ], 00:30:20.967 "driver_specific": { 00:30:20.967 "passthru": { 00:30:20.967 "name": "pt2", 00:30:20.967 "base_bdev_name": "malloc2" 00:30:20.967 } 00:30:20.967 } 00:30:20.967 }' 00:30:20.967 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:21.226 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:21.226 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:21.226 11:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:21.226 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:21.226 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:21.226 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:21.226 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:21.226 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:21.226 11:41:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:21.485 "name": "pt3", 00:30:21.485 "aliases": [ 00:30:21.485 "00000000-0000-0000-0000-000000000003" 00:30:21.485 ], 00:30:21.485 "product_name": "passthru", 00:30:21.485 "block_size": 512, 00:30:21.485 "num_blocks": 65536, 00:30:21.485 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:21.485 "assigned_rate_limits": { 00:30:21.485 "rw_ios_per_sec": 0, 00:30:21.485 "rw_mbytes_per_sec": 0, 00:30:21.485 "r_mbytes_per_sec": 0, 00:30:21.485 "w_mbytes_per_sec": 0 00:30:21.485 }, 00:30:21.485 "claimed": true, 00:30:21.485 "claim_type": "exclusive_write", 00:30:21.485 "zoned": false, 00:30:21.485 "supported_io_types": { 00:30:21.485 "read": true, 00:30:21.485 "write": true, 00:30:21.485 "unmap": true, 00:30:21.485 "write_zeroes": true, 00:30:21.485 "flush": true, 00:30:21.485 "reset": true, 00:30:21.485 "compare": false, 00:30:21.485 "compare_and_write": false, 00:30:21.485 "abort": true, 00:30:21.485 "nvme_admin": false, 00:30:21.485 "nvme_io": false 00:30:21.485 }, 00:30:21.485 "memory_domains": [ 00:30:21.485 { 00:30:21.485 "dma_device_id": "system", 00:30:21.485 "dma_device_type": 1 00:30:21.485 }, 00:30:21.485 { 00:30:21.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:21.485 "dma_device_type": 2 
00:30:21.485 } 00:30:21.485 ], 00:30:21.485 "driver_specific": { 00:30:21.485 "passthru": { 00:30:21.485 "name": "pt3", 00:30:21.485 "base_bdev_name": "malloc3" 00:30:21.485 } 00:30:21.485 } 00:30:21.485 }' 00:30:21.485 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:21.745 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:22.004 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:22.004 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:22.004 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:30:22.004 [2024-06-10 11:41:05.852524] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:22.004 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=6e065df4-6466-4f0f-b151-358cd9e02a60 00:30:22.004 11:41:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 6e065df4-6466-4f0f-b151-358cd9e02a60 ']' 00:30:22.004 11:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:22.264 [2024-06-10 11:41:06.032833] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:22.264 [2024-06-10 11:41:06.032848] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:22.264 [2024-06-10 11:41:06.032882] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:22.264 [2024-06-10 11:41:06.032918] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:22.264 [2024-06-10 11:41:06.032926] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2521d40 name raid_bdev1, state offline 00:30:22.264 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:22.264 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:30:22.523 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:30:22.523 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:30:22.523 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:22.523 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:22.523 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:22.523 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:22.782 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:22.782 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:30:22.782 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:30:22.782 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:23.042 11:41:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:30:23.301 [2024-06-10 11:41:07.051449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:30:23.302 [2024-06-10 11:41:07.052466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:30:23.302 [2024-06-10 11:41:07.052497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:30:23.302 [2024-06-10 11:41:07.052529] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:30:23.302 [2024-06-10 11:41:07.052556] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:30:23.302 [2024-06-10 11:41:07.052587] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:30:23.302 [2024-06-10 11:41:07.052599] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:23.302 [2024-06-10 11:41:07.052607] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x251fad0 name raid_bdev1, state configuring 00:30:23.302 request: 00:30:23.302 { 00:30:23.302 "name": "raid_bdev1", 00:30:23.302 "raid_level": "raid0", 00:30:23.302 "base_bdevs": [ 00:30:23.302 "malloc1", 00:30:23.302 "malloc2", 00:30:23.302 "malloc3" 00:30:23.302 ], 00:30:23.302 "superblock": false, 00:30:23.302 "strip_size_kb": 64, 00:30:23.302 "method": "bdev_raid_create", 00:30:23.302 "req_id": 1 00:30:23.302 } 00:30:23.302 Got JSON-RPC error response 00:30:23.302 response: 00:30:23.302 { 00:30:23.302 "code": -17, 00:30:23.302 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:30:23.302 } 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:30:23.302 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:23.561 [2024-06-10 11:41:07.404332] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:23.561 [2024-06-10 11:41:07.404364] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:30:23.562 [2024-06-10 11:41:07.404392] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2520b50 00:30:23.562 [2024-06-10 11:41:07.404401] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:23.562 [2024-06-10 11:41:07.405535] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:23.562 [2024-06-10 11:41:07.405558] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:23.562 [2024-06-10 11:41:07.405612] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:23.562 [2024-06-10 11:41:07.405632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:23.562 pt1 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:23.562 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:23.821 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:23.821 "name": "raid_bdev1", 00:30:23.821 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:23.821 "strip_size_kb": 64, 00:30:23.821 "state": "configuring", 00:30:23.821 "raid_level": "raid0", 00:30:23.821 "superblock": true, 00:30:23.821 "num_base_bdevs": 3, 00:30:23.821 "num_base_bdevs_discovered": 1, 00:30:23.821 "num_base_bdevs_operational": 3, 00:30:23.821 "base_bdevs_list": [ 00:30:23.821 { 00:30:23.821 "name": "pt1", 00:30:23.821 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:23.821 "is_configured": true, 00:30:23.821 "data_offset": 2048, 00:30:23.821 "data_size": 63488 00:30:23.821 }, 00:30:23.821 { 00:30:23.821 "name": null, 00:30:23.821 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:23.821 "is_configured": false, 00:30:23.821 "data_offset": 2048, 00:30:23.821 "data_size": 63488 00:30:23.821 }, 00:30:23.821 { 00:30:23.821 "name": null, 00:30:23.821 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:23.821 "is_configured": false, 00:30:23.821 "data_offset": 2048, 00:30:23.821 "data_size": 63488 00:30:23.821 } 00:30:23.821 ] 00:30:23.821 }' 00:30:23.821 11:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:23.821 11:41:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:24.389 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:30:24.389 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:24.389 
[2024-06-10 11:41:08.266552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:24.389 [2024-06-10 11:41:08.266585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:24.389 [2024-06-10 11:41:08.266613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2373330 00:30:24.389 [2024-06-10 11:41:08.266621] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:24.389 [2024-06-10 11:41:08.266859] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:24.389 [2024-06-10 11:41:08.266878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:24.389 [2024-06-10 11:41:08.266921] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:24.389 [2024-06-10 11:41:08.266936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:24.389 pt2 00:30:24.389 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:24.647 [2024-06-10 11:41:08.426981] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:24.647 11:41:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:24.647 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:24.906 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:24.906 "name": "raid_bdev1", 00:30:24.906 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:24.906 "strip_size_kb": 64, 00:30:24.906 "state": "configuring", 00:30:24.906 "raid_level": "raid0", 00:30:24.906 "superblock": true, 00:30:24.906 "num_base_bdevs": 3, 00:30:24.906 "num_base_bdevs_discovered": 1, 00:30:24.906 "num_base_bdevs_operational": 3, 00:30:24.906 "base_bdevs_list": [ 00:30:24.906 { 00:30:24.906 "name": "pt1", 00:30:24.906 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:24.906 "is_configured": true, 00:30:24.906 "data_offset": 2048, 00:30:24.906 "data_size": 63488 00:30:24.906 }, 00:30:24.906 { 00:30:24.906 "name": null, 00:30:24.906 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:24.906 "is_configured": false, 00:30:24.906 "data_offset": 2048, 00:30:24.906 "data_size": 63488 00:30:24.907 }, 00:30:24.907 { 00:30:24.907 "name": null, 00:30:24.907 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:24.907 "is_configured": false, 00:30:24.907 "data_offset": 2048, 00:30:24.907 "data_size": 63488 00:30:24.907 } 00:30:24.907 ] 00:30:24.907 }' 00:30:24.907 11:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:30:24.907 11:41:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:25.166 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:30:25.166 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:25.166 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:25.424 [2024-06-10 11:41:09.261133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:25.424 [2024-06-10 11:41:09.261164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:25.424 [2024-06-10 11:41:09.261197] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251d490 00:30:25.424 [2024-06-10 11:41:09.261206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:25.424 [2024-06-10 11:41:09.261446] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:25.424 [2024-06-10 11:41:09.261458] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:25.424 [2024-06-10 11:41:09.261502] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:25.425 [2024-06-10 11:41:09.261516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:25.425 pt2 00:30:25.425 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:25.425 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:25.425 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:30:25.683 [2024-06-10 11:41:09.445622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:30:25.683 [2024-06-10 11:41:09.445653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:25.683 [2024-06-10 11:41:09.445665] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251ed00 00:30:25.683 [2024-06-10 11:41:09.445674] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:25.683 [2024-06-10 11:41:09.445916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:25.683 [2024-06-10 11:41:09.445929] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:30:25.683 [2024-06-10 11:41:09.445970] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:30:25.683 [2024-06-10 11:41:09.445984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:30:25.683 [2024-06-10 11:41:09.446058] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x251d7f0 00:30:25.683 [2024-06-10 11:41:09.446066] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:30:25.683 [2024-06-10 11:41:09.446181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23692e0 00:30:25.683 [2024-06-10 11:41:09.446268] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x251d7f0 00:30:25.683 [2024-06-10 11:41:09.446275] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x251d7f0 00:30:25.683 [2024-06-10 11:41:09.446343] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:25.683 pt3 00:30:25.683 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:25.683 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:25.683 11:41:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:30:25.683 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:25.683 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:25.683 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:25.683 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.684 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:25.942 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:25.942 "name": "raid_bdev1", 00:30:25.942 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:25.942 "strip_size_kb": 64, 00:30:25.942 "state": "online", 00:30:25.942 "raid_level": "raid0", 00:30:25.942 "superblock": true, 00:30:25.942 "num_base_bdevs": 3, 00:30:25.942 "num_base_bdevs_discovered": 3, 00:30:25.942 "num_base_bdevs_operational": 3, 00:30:25.942 "base_bdevs_list": [ 00:30:25.942 { 00:30:25.942 "name": "pt1", 00:30:25.942 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:30:25.942 "is_configured": true, 00:30:25.942 "data_offset": 2048, 00:30:25.942 "data_size": 63488 00:30:25.942 }, 00:30:25.942 { 00:30:25.942 "name": "pt2", 00:30:25.942 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:25.942 "is_configured": true, 00:30:25.942 "data_offset": 2048, 00:30:25.942 "data_size": 63488 00:30:25.942 }, 00:30:25.942 { 00:30:25.942 "name": "pt3", 00:30:25.942 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:25.942 "is_configured": true, 00:30:25.942 "data_offset": 2048, 00:30:25.942 "data_size": 63488 00:30:25.942 } 00:30:25.942 ] 00:30:25.942 }' 00:30:25.942 11:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:25.942 11:41:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:26.508 [2024-06-10 11:41:10.312117] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:26.508 
"name": "raid_bdev1", 00:30:26.508 "aliases": [ 00:30:26.508 "6e065df4-6466-4f0f-b151-358cd9e02a60" 00:30:26.508 ], 00:30:26.508 "product_name": "Raid Volume", 00:30:26.508 "block_size": 512, 00:30:26.508 "num_blocks": 190464, 00:30:26.508 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:26.508 "assigned_rate_limits": { 00:30:26.508 "rw_ios_per_sec": 0, 00:30:26.508 "rw_mbytes_per_sec": 0, 00:30:26.508 "r_mbytes_per_sec": 0, 00:30:26.508 "w_mbytes_per_sec": 0 00:30:26.508 }, 00:30:26.508 "claimed": false, 00:30:26.508 "zoned": false, 00:30:26.508 "supported_io_types": { 00:30:26.508 "read": true, 00:30:26.508 "write": true, 00:30:26.508 "unmap": true, 00:30:26.508 "write_zeroes": true, 00:30:26.508 "flush": true, 00:30:26.508 "reset": true, 00:30:26.508 "compare": false, 00:30:26.508 "compare_and_write": false, 00:30:26.508 "abort": false, 00:30:26.508 "nvme_admin": false, 00:30:26.508 "nvme_io": false 00:30:26.508 }, 00:30:26.508 "memory_domains": [ 00:30:26.508 { 00:30:26.508 "dma_device_id": "system", 00:30:26.508 "dma_device_type": 1 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:26.508 "dma_device_type": 2 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "dma_device_id": "system", 00:30:26.508 "dma_device_type": 1 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:26.508 "dma_device_type": 2 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "dma_device_id": "system", 00:30:26.508 "dma_device_type": 1 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:26.508 "dma_device_type": 2 00:30:26.508 } 00:30:26.508 ], 00:30:26.508 "driver_specific": { 00:30:26.508 "raid": { 00:30:26.508 "uuid": "6e065df4-6466-4f0f-b151-358cd9e02a60", 00:30:26.508 "strip_size_kb": 64, 00:30:26.508 "state": "online", 00:30:26.508 "raid_level": "raid0", 00:30:26.508 "superblock": true, 00:30:26.508 "num_base_bdevs": 3, 00:30:26.508 "num_base_bdevs_discovered": 3, 
00:30:26.508 "num_base_bdevs_operational": 3, 00:30:26.508 "base_bdevs_list": [ 00:30:26.508 { 00:30:26.508 "name": "pt1", 00:30:26.508 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:26.508 "is_configured": true, 00:30:26.508 "data_offset": 2048, 00:30:26.508 "data_size": 63488 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "name": "pt2", 00:30:26.508 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:26.508 "is_configured": true, 00:30:26.508 "data_offset": 2048, 00:30:26.508 "data_size": 63488 00:30:26.508 }, 00:30:26.508 { 00:30:26.508 "name": "pt3", 00:30:26.508 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:26.508 "is_configured": true, 00:30:26.508 "data_offset": 2048, 00:30:26.508 "data_size": 63488 00:30:26.508 } 00:30:26.508 ] 00:30:26.508 } 00:30:26.508 } 00:30:26.508 }' 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:26.508 pt2 00:30:26.508 pt3' 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:26.508 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:26.766 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:26.766 "name": "pt1", 00:30:26.766 "aliases": [ 00:30:26.767 "00000000-0000-0000-0000-000000000001" 00:30:26.767 ], 00:30:26.767 "product_name": "passthru", 00:30:26.767 "block_size": 512, 00:30:26.767 "num_blocks": 65536, 00:30:26.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:26.767 "assigned_rate_limits": { 00:30:26.767 "rw_ios_per_sec": 0, 00:30:26.767 
"rw_mbytes_per_sec": 0, 00:30:26.767 "r_mbytes_per_sec": 0, 00:30:26.767 "w_mbytes_per_sec": 0 00:30:26.767 }, 00:30:26.767 "claimed": true, 00:30:26.767 "claim_type": "exclusive_write", 00:30:26.767 "zoned": false, 00:30:26.767 "supported_io_types": { 00:30:26.767 "read": true, 00:30:26.767 "write": true, 00:30:26.767 "unmap": true, 00:30:26.767 "write_zeroes": true, 00:30:26.767 "flush": true, 00:30:26.767 "reset": true, 00:30:26.767 "compare": false, 00:30:26.767 "compare_and_write": false, 00:30:26.767 "abort": true, 00:30:26.767 "nvme_admin": false, 00:30:26.767 "nvme_io": false 00:30:26.767 }, 00:30:26.767 "memory_domains": [ 00:30:26.767 { 00:30:26.767 "dma_device_id": "system", 00:30:26.767 "dma_device_type": 1 00:30:26.767 }, 00:30:26.767 { 00:30:26.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:26.767 "dma_device_type": 2 00:30:26.767 } 00:30:26.767 ], 00:30:26.767 "driver_specific": { 00:30:26.767 "passthru": { 00:30:26.767 "name": "pt1", 00:30:26.767 "base_bdev_name": "malloc1" 00:30:26.767 } 00:30:26.767 } 00:30:26.767 }' 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:26.767 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:27.026 11:41:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:27.026 11:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:27.285 "name": "pt2", 00:30:27.285 "aliases": [ 00:30:27.285 "00000000-0000-0000-0000-000000000002" 00:30:27.285 ], 00:30:27.285 "product_name": "passthru", 00:30:27.285 "block_size": 512, 00:30:27.285 "num_blocks": 65536, 00:30:27.285 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:27.285 "assigned_rate_limits": { 00:30:27.285 "rw_ios_per_sec": 0, 00:30:27.285 "rw_mbytes_per_sec": 0, 00:30:27.285 "r_mbytes_per_sec": 0, 00:30:27.285 "w_mbytes_per_sec": 0 00:30:27.285 }, 00:30:27.285 "claimed": true, 00:30:27.285 "claim_type": "exclusive_write", 00:30:27.285 "zoned": false, 00:30:27.285 "supported_io_types": { 00:30:27.285 "read": true, 00:30:27.285 "write": true, 00:30:27.285 "unmap": true, 00:30:27.285 "write_zeroes": true, 00:30:27.285 "flush": true, 00:30:27.285 "reset": true, 00:30:27.285 "compare": false, 00:30:27.285 "compare_and_write": false, 00:30:27.285 "abort": true, 00:30:27.285 "nvme_admin": false, 00:30:27.285 "nvme_io": false 00:30:27.285 }, 00:30:27.285 "memory_domains": [ 00:30:27.285 { 00:30:27.285 "dma_device_id": "system", 00:30:27.285 "dma_device_type": 1 00:30:27.285 }, 00:30:27.285 { 00:30:27.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:27.285 "dma_device_type": 2 
00:30:27.285 } 00:30:27.285 ], 00:30:27.285 "driver_specific": { 00:30:27.285 "passthru": { 00:30:27.285 "name": "pt2", 00:30:27.285 "base_bdev_name": "malloc2" 00:30:27.285 } 00:30:27.285 } 00:30:27.285 }' 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:27.285 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:27.544 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:27.544 "name": "pt3", 00:30:27.544 "aliases": [ 00:30:27.544 "00000000-0000-0000-0000-000000000003" 
00:30:27.544 ], 00:30:27.544 "product_name": "passthru", 00:30:27.544 "block_size": 512, 00:30:27.544 "num_blocks": 65536, 00:30:27.544 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:27.544 "assigned_rate_limits": { 00:30:27.544 "rw_ios_per_sec": 0, 00:30:27.544 "rw_mbytes_per_sec": 0, 00:30:27.544 "r_mbytes_per_sec": 0, 00:30:27.544 "w_mbytes_per_sec": 0 00:30:27.544 }, 00:30:27.544 "claimed": true, 00:30:27.544 "claim_type": "exclusive_write", 00:30:27.544 "zoned": false, 00:30:27.544 "supported_io_types": { 00:30:27.544 "read": true, 00:30:27.544 "write": true, 00:30:27.544 "unmap": true, 00:30:27.544 "write_zeroes": true, 00:30:27.544 "flush": true, 00:30:27.544 "reset": true, 00:30:27.544 "compare": false, 00:30:27.544 "compare_and_write": false, 00:30:27.544 "abort": true, 00:30:27.544 "nvme_admin": false, 00:30:27.544 "nvme_io": false 00:30:27.544 }, 00:30:27.544 "memory_domains": [ 00:30:27.544 { 00:30:27.545 "dma_device_id": "system", 00:30:27.545 "dma_device_type": 1 00:30:27.545 }, 00:30:27.545 { 00:30:27.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:27.545 "dma_device_type": 2 00:30:27.545 } 00:30:27.545 ], 00:30:27.545 "driver_specific": { 00:30:27.545 "passthru": { 00:30:27.545 "name": "pt3", 00:30:27.545 "base_bdev_name": "malloc3" 00:30:27.545 } 00:30:27.545 } 00:30:27.545 }' 00:30:27.545 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:27.803 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:30:28.063 [2024-06-10 11:41:11.932293] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 6e065df4-6466-4f0f-b151-358cd9e02a60 '!=' 6e065df4-6466-4f0f-b151-358cd9e02a60 ']' 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 154243 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 154243 ']' 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 154243 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 154243 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 154243' 00:30:28.063 killing process with pid 154243 00:30:28.063 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 154243 00:30:28.063 [2024-06-10 11:41:11.989146] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:28.063 [2024-06-10 11:41:11.989186] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:28.063 [2024-06-10 11:41:11.989224] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:28.063 [2024-06-10 11:41:11.989233] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x251d7f0 name raid_bdev1, state offline 00:30:28.064 11:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 154243 00:30:28.323 [2024-06-10 11:41:12.019042] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:28.323 11:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:30:28.323 00:30:28.323 real 0m10.910s 00:30:28.323 user 0m19.453s 00:30:28.323 sys 0m2.105s 00:30:28.323 11:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:28.323 11:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:28.323 ************************************ 00:30:28.323 END TEST raid_superblock_test 00:30:28.323 ************************************ 00:30:28.323 11:41:12 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:30:28.323 11:41:12 bdev_raid -- 
common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:28.323 11:41:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:28.323 11:41:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:28.582 ************************************ 00:30:28.582 START TEST raid_read_error_test 00:30:28.582 ************************************ 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 read 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:28.582 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.oUiXji0xMu 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=155978 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 155978 /var/tmp/spdk-raid.sock 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 155978 ']' 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:28.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:28.583 11:41:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:28.583 [2024-06-10 11:41:12.373118] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:30:28.583 [2024-06-10 11:41:12.373169] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155978 ] 00:30:28.583 [2024-06-10 11:41:12.459917] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:28.842 [2024-06-10 11:41:12.549774] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:28.842 [2024-06-10 11:41:12.608891] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:28.842 [2024-06-10 11:41:12.608923] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:29.411 11:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:29.411 11:41:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:30:29.411 11:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:29.411 11:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:30:29.411 BaseBdev1_malloc 00:30:29.411 11:41:13 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:30:29.670 true 00:30:29.670 11:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:30:29.929 [2024-06-10 11:41:13.677620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:30:29.929 [2024-06-10 11:41:13.677657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:29.929 [2024-06-10 11:41:13.677688] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbbab10 00:30:29.929 [2024-06-10 11:41:13.677697] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:29.929 [2024-06-10 11:41:13.679036] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:29.929 [2024-06-10 11:41:13.679058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:29.929 BaseBdev1 00:30:29.929 11:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:29.929 11:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:30:29.929 BaseBdev2_malloc 00:30:29.929 11:41:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:30:30.189 true 00:30:30.189 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc 
-p BaseBdev2 00:30:30.448 [2024-06-10 11:41:14.179794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:30:30.448 [2024-06-10 11:41:14.179829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:30.448 [2024-06-10 11:41:14.179845] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbbf280 00:30:30.448 [2024-06-10 11:41:14.179853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:30.448 [2024-06-10 11:41:14.181032] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:30.448 [2024-06-10 11:41:14.181054] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:30.448 BaseBdev2 00:30:30.448 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:30.448 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:30:30.448 BaseBdev3_malloc 00:30:30.448 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:30:30.707 true 00:30:30.707 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:30:30.966 [2024-06-10 11:41:14.698045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:30:30.966 [2024-06-10 11:41:14.698081] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:30.966 [2024-06-10 11:41:14.698099] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbc1ab0 00:30:30.966 [2024-06-10 
11:41:14.698107] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:30.966 [2024-06-10 11:41:14.699268] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:30.966 [2024-06-10 11:41:14.699289] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:30:30.966 BaseBdev3 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:30:30.967 [2024-06-10 11:41:14.874527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:30.967 [2024-06-10 11:41:14.875538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:30.967 [2024-06-10 11:41:14.875586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:30.967 [2024-06-10 11:41:14.875736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbc00b0 00:30:30.967 [2024-06-10 11:41:14.875744] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:30:30.967 [2024-06-10 11:41:14.875894] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbbc4e0 00:30:30.967 [2024-06-10 11:41:14.876006] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbc00b0 00:30:30.967 [2024-06-10 11:41:14.876013] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbc00b0 00:30:30.967 [2024-06-10 11:41:14.876086] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:30.967 11:41:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:31.226 11:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:31.226 "name": "raid_bdev1", 00:30:31.226 "uuid": "0378326c-dd59-4125-809e-2e966f1eaccd", 00:30:31.226 "strip_size_kb": 64, 00:30:31.226 "state": "online", 00:30:31.226 "raid_level": "raid0", 00:30:31.226 "superblock": true, 00:30:31.226 "num_base_bdevs": 3, 00:30:31.226 "num_base_bdevs_discovered": 3, 00:30:31.226 "num_base_bdevs_operational": 3, 00:30:31.226 "base_bdevs_list": [ 00:30:31.226 { 00:30:31.226 "name": "BaseBdev1", 00:30:31.226 "uuid": "dea176cc-8c19-51cc-b405-400a01ac04f9", 00:30:31.226 "is_configured": true, 00:30:31.226 "data_offset": 2048, 00:30:31.226 "data_size": 63488 00:30:31.226 }, 00:30:31.226 { 00:30:31.226 "name": "BaseBdev2", 00:30:31.226 "uuid": 
"228ebdf3-9c26-5ad5-aa86-5e92fc6f38de", 00:30:31.226 "is_configured": true, 00:30:31.226 "data_offset": 2048, 00:30:31.226 "data_size": 63488 00:30:31.226 }, 00:30:31.226 { 00:30:31.226 "name": "BaseBdev3", 00:30:31.226 "uuid": "b00bd125-ce31-52d4-a083-765f69005b4c", 00:30:31.226 "is_configured": true, 00:30:31.226 "data_offset": 2048, 00:30:31.226 "data_size": 63488 00:30:31.226 } 00:30:31.226 ] 00:30:31.226 }' 00:30:31.226 11:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:31.226 11:41:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:31.794 11:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:30:31.794 11:41:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:30:31.794 [2024-06-10 11:41:15.640735] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbc05d0 00:30:32.732 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:32.992 11:41:16 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:32.992 "name": "raid_bdev1", 00:30:32.992 "uuid": "0378326c-dd59-4125-809e-2e966f1eaccd", 00:30:32.992 "strip_size_kb": 64, 00:30:32.992 "state": "online", 00:30:32.992 "raid_level": "raid0", 00:30:32.992 "superblock": true, 00:30:32.992 "num_base_bdevs": 3, 00:30:32.992 "num_base_bdevs_discovered": 3, 00:30:32.992 "num_base_bdevs_operational": 3, 00:30:32.992 "base_bdevs_list": [ 00:30:32.992 { 00:30:32.992 "name": "BaseBdev1", 00:30:32.992 "uuid": "dea176cc-8c19-51cc-b405-400a01ac04f9", 00:30:32.992 "is_configured": true, 00:30:32.992 "data_offset": 2048, 00:30:32.992 "data_size": 63488 00:30:32.992 }, 00:30:32.992 { 00:30:32.992 "name": "BaseBdev2", 00:30:32.992 "uuid": "228ebdf3-9c26-5ad5-aa86-5e92fc6f38de", 00:30:32.992 "is_configured": true, 00:30:32.992 "data_offset": 2048, 00:30:32.992 "data_size": 63488 00:30:32.992 }, 
00:30:32.992 { 00:30:32.992 "name": "BaseBdev3", 00:30:32.992 "uuid": "b00bd125-ce31-52d4-a083-765f69005b4c", 00:30:32.992 "is_configured": true, 00:30:32.992 "data_offset": 2048, 00:30:32.992 "data_size": 63488 00:30:32.992 } 00:30:32.992 ] 00:30:32.992 }' 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:32.992 11:41:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:33.561 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:33.821 [2024-06-10 11:41:17.602279] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:33.821 [2024-06-10 11:41:17.602307] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:33.821 [2024-06-10 11:41:17.604406] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:33.821 [2024-06-10 11:41:17.604430] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:33.821 [2024-06-10 11:41:17.604453] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:33.821 [2024-06-10 11:41:17.604461] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbc00b0 name raid_bdev1, state offline 00:30:33.821 0 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 155978 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 155978 ']' 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 155978 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:33.821 11:41:17 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 155978 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 155978' 00:30:33.821 killing process with pid 155978 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 155978 00:30:33.821 [2024-06-10 11:41:17.658431] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:33.821 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 155978 00:30:33.821 [2024-06-10 11:41:17.679351] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.oUiXji0xMu 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:30:34.081 00:30:34.081 real 0m5.567s 00:30:34.081 user 0m8.509s 00:30:34.081 sys 0m0.992s 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:34.081 11:41:17 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:30:34.081 ************************************ 00:30:34.081 END TEST raid_read_error_test 00:30:34.081 ************************************ 00:30:34.081 11:41:17 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:30:34.081 11:41:17 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:34.081 11:41:17 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:34.081 11:41:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:34.081 ************************************ 00:30:34.081 START TEST raid_write_error_test 00:30:34.081 ************************************ 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 write 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:34.081 11:41:17 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yOWqdck40j 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=156783 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 156783 /var/tmp/spdk-raid.sock 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T 
raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 156783 ']' 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:34.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:34.081 11:41:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:34.081 [2024-06-10 11:41:18.014461] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:30:34.081 [2024-06-10 11:41:18.014510] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156783 ] 00:30:34.341 [2024-06-10 11:41:18.101024] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:34.341 [2024-06-10 11:41:18.186560] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:34.341 [2024-06-10 11:41:18.245916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:34.341 [2024-06-10 11:41:18.245948] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:34.909 11:41:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:34.909 11:41:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:30:34.909 11:41:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:34.909 11:41:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:30:35.168 BaseBdev1_malloc 00:30:35.168 11:41:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:30:35.427 true 00:30:35.427 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:30:35.427 [2024-06-10 11:41:19.327934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:30:35.427 [2024-06-10 11:41:19.327972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:35.427 
[2024-06-10 11:41:19.327987] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc6fb10 00:30:35.427 [2024-06-10 11:41:19.327996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:35.427 [2024-06-10 11:41:19.329324] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:35.427 [2024-06-10 11:41:19.329352] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:35.427 BaseBdev1 00:30:35.427 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:35.427 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:30:35.686 BaseBdev2_malloc 00:30:35.686 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:30:35.945 true 00:30:35.945 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:30:35.945 [2024-06-10 11:41:19.821403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:30:35.945 [2024-06-10 11:41:19.821439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:35.945 [2024-06-10 11:41:19.821453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc74280 00:30:35.945 [2024-06-10 11:41:19.821462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:35.945 [2024-06-10 11:41:19.822617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:35.945 [2024-06-10 11:41:19.822638] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:35.945 BaseBdev2 00:30:35.945 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:35.945 11:41:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:30:36.204 BaseBdev3_malloc 00:30:36.204 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:30:36.463 true 00:30:36.463 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:30:36.463 [2024-06-10 11:41:20.367876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:30:36.463 [2024-06-10 11:41:20.367913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:36.463 [2024-06-10 11:41:20.367929] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc76ab0 00:30:36.463 [2024-06-10 11:41:20.367954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:36.463 [2024-06-10 11:41:20.369131] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:36.463 [2024-06-10 11:41:20.369152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:30:36.463 BaseBdev3 00:30:36.463 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:30:36.722 [2024-06-10 11:41:20.540347] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:36.722 [2024-06-10 11:41:20.541336] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:36.722 [2024-06-10 11:41:20.541385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:36.722 [2024-06-10 11:41:20.541538] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc750b0 00:30:36.722 [2024-06-10 11:41:20.541545] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:30:36.722 [2024-06-10 11:41:20.541701] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc714e0 00:30:36.722 [2024-06-10 11:41:20.541819] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc750b0 00:30:36.722 [2024-06-10 11:41:20.541825] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc750b0 00:30:36.722 [2024-06-10 11:41:20.541907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:36.722 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:36.981 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:36.981 "name": "raid_bdev1", 00:30:36.981 "uuid": "7396d5a2-ae02-4583-87b7-2da60fab1530", 00:30:36.981 "strip_size_kb": 64, 00:30:36.981 "state": "online", 00:30:36.981 "raid_level": "raid0", 00:30:36.981 "superblock": true, 00:30:36.981 "num_base_bdevs": 3, 00:30:36.981 "num_base_bdevs_discovered": 3, 00:30:36.981 "num_base_bdevs_operational": 3, 00:30:36.981 "base_bdevs_list": [ 00:30:36.981 { 00:30:36.981 "name": "BaseBdev1", 00:30:36.981 "uuid": "92a69e21-b247-5fbb-9f67-86561130bdbc", 00:30:36.981 "is_configured": true, 00:30:36.981 "data_offset": 2048, 00:30:36.981 "data_size": 63488 00:30:36.981 }, 00:30:36.981 { 00:30:36.981 "name": "BaseBdev2", 00:30:36.981 "uuid": "ec2dbdd8-5231-5880-aabf-96eec9963d37", 00:30:36.981 "is_configured": true, 00:30:36.982 "data_offset": 2048, 00:30:36.982 "data_size": 63488 00:30:36.982 }, 00:30:36.982 { 00:30:36.982 "name": "BaseBdev3", 00:30:36.982 "uuid": "349f4a56-37d6-50fa-bf0d-a21bb6d922b9", 00:30:36.982 "is_configured": true, 00:30:36.982 "data_offset": 2048, 00:30:36.982 "data_size": 63488 00:30:36.982 } 00:30:36.982 ] 00:30:36.982 }' 00:30:36.982 11:41:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:36.982 11:41:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:37.549 11:41:21 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:30:37.549 11:41:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:30:37.549 [2024-06-10 11:41:21.314531] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc755d0 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:38.539 11:41:22 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.539 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:38.798 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:38.798 "name": "raid_bdev1", 00:30:38.798 "uuid": "7396d5a2-ae02-4583-87b7-2da60fab1530", 00:30:38.798 "strip_size_kb": 64, 00:30:38.798 "state": "online", 00:30:38.798 "raid_level": "raid0", 00:30:38.798 "superblock": true, 00:30:38.798 "num_base_bdevs": 3, 00:30:38.798 "num_base_bdevs_discovered": 3, 00:30:38.798 "num_base_bdevs_operational": 3, 00:30:38.798 "base_bdevs_list": [ 00:30:38.798 { 00:30:38.798 "name": "BaseBdev1", 00:30:38.798 "uuid": "92a69e21-b247-5fbb-9f67-86561130bdbc", 00:30:38.798 "is_configured": true, 00:30:38.798 "data_offset": 2048, 00:30:38.798 "data_size": 63488 00:30:38.798 }, 00:30:38.798 { 00:30:38.798 "name": "BaseBdev2", 00:30:38.798 "uuid": "ec2dbdd8-5231-5880-aabf-96eec9963d37", 00:30:38.798 "is_configured": true, 00:30:38.798 "data_offset": 2048, 00:30:38.798 "data_size": 63488 00:30:38.798 }, 00:30:38.798 { 00:30:38.798 "name": "BaseBdev3", 00:30:38.798 "uuid": "349f4a56-37d6-50fa-bf0d-a21bb6d922b9", 00:30:38.798 "is_configured": true, 00:30:38.798 "data_offset": 2048, 00:30:38.798 "data_size": 63488 00:30:38.798 } 00:30:38.798 ] 00:30:38.798 }' 00:30:38.798 11:41:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:38.798 11:41:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:30:39.367 [2024-06-10 11:41:23.235430] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:39.367 [2024-06-10 11:41:23.235465] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:39.367 [2024-06-10 11:41:23.237461] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:39.367 [2024-06-10 11:41:23.237485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:39.367 [2024-06-10 11:41:23.237507] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:39.367 [2024-06-10 11:41:23.237515] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc750b0 name raid_bdev1, state offline 00:30:39.367 0 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 156783 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 156783 ']' 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 156783 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 156783 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 156783' 00:30:39.367 killing process with pid 156783 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 156783 00:30:39.367 [2024-06-10 
11:41:23.301155] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:39.367 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 156783 00:30:39.626 [2024-06-10 11:41:23.321462] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yOWqdck40j 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:30:39.626 00:30:39.626 real 0m5.571s 00:30:39.626 user 0m8.472s 00:30:39.626 sys 0m1.019s 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:39.626 11:41:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:39.626 ************************************ 00:30:39.626 END TEST raid_write_error_test 00:30:39.626 ************************************ 00:30:39.626 11:41:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:30:39.626 11:41:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:30:39.626 11:41:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:39.626 11:41:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:39.626 11:41:23 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:30:39.885 ************************************ 00:30:39.885 START TEST raid_state_function_test 00:30:39.885 ************************************ 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 false 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=157599 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 157599' 00:30:39.885 Process raid pid: 157599 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 157599 /var/tmp/spdk-raid.sock 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 157599 ']' 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:39.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:39.885 11:41:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:39.885 [2024-06-10 11:41:23.644936] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:30:39.885 [2024-06-10 11:41:23.644982] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:39.885 [2024-06-10 11:41:23.731938] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:39.885 [2024-06-10 11:41:23.816690] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:40.144 [2024-06-10 11:41:23.876107] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:40.144 [2024-06-10 11:41:23.876131] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:30:40.711 [2024-06-10 11:41:24.563249] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:40.711 [2024-06-10 11:41:24.563293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:40.711 [2024-06-10 11:41:24.563301] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:40.711 [2024-06-10 11:41:24.563309] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:40.711 [2024-06-10 11:41:24.563314] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:30:40.711 [2024-06-10 11:41:24.563321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:40.711 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:40.970 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:40.970 "name": "Existed_Raid", 00:30:40.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.970 "strip_size_kb": 64, 00:30:40.970 "state": "configuring", 00:30:40.970 "raid_level": "concat", 00:30:40.970 "superblock": false, 00:30:40.970 "num_base_bdevs": 3, 00:30:40.970 "num_base_bdevs_discovered": 0, 00:30:40.970 "num_base_bdevs_operational": 3, 00:30:40.970 "base_bdevs_list": [ 00:30:40.970 { 00:30:40.970 "name": "BaseBdev1", 00:30:40.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.970 "is_configured": false, 00:30:40.970 "data_offset": 0, 00:30:40.970 "data_size": 0 00:30:40.970 }, 00:30:40.970 { 00:30:40.970 "name": "BaseBdev2", 00:30:40.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.970 "is_configured": false, 00:30:40.970 "data_offset": 0, 00:30:40.970 "data_size": 0 00:30:40.970 }, 00:30:40.970 { 00:30:40.970 "name": "BaseBdev3", 00:30:40.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.970 "is_configured": false, 00:30:40.970 "data_offset": 0, 00:30:40.970 "data_size": 0 00:30:40.970 } 00:30:40.970 ] 00:30:40.970 }' 00:30:40.970 11:41:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:40.970 11:41:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:41.537 11:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:41.537 [2024-06-10 11:41:25.393325] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:41.537 [2024-06-10 
11:41:25.393351] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2396530 name Existed_Raid, state configuring 00:30:41.537 11:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:30:41.795 [2024-06-10 11:41:25.577814] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:41.796 [2024-06-10 11:41:25.577839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:41.796 [2024-06-10 11:41:25.577845] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:41.796 [2024-06-10 11:41:25.577852] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:41.796 [2024-06-10 11:41:25.577882] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:30:41.796 [2024-06-10 11:41:25.577890] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:30:41.796 11:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:30:42.055 [2024-06-10 11:41:25.758768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:42.055 BaseBdev1 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:42.055 
11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:42.055 11:41:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:42.315 [ 00:30:42.315 { 00:30:42.315 "name": "BaseBdev1", 00:30:42.315 "aliases": [ 00:30:42.315 "62a4eae2-53ca-41e7-accb-0e5fd0660095" 00:30:42.315 ], 00:30:42.315 "product_name": "Malloc disk", 00:30:42.315 "block_size": 512, 00:30:42.315 "num_blocks": 65536, 00:30:42.315 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:42.315 "assigned_rate_limits": { 00:30:42.315 "rw_ios_per_sec": 0, 00:30:42.315 "rw_mbytes_per_sec": 0, 00:30:42.315 "r_mbytes_per_sec": 0, 00:30:42.315 "w_mbytes_per_sec": 0 00:30:42.315 }, 00:30:42.315 "claimed": true, 00:30:42.315 "claim_type": "exclusive_write", 00:30:42.315 "zoned": false, 00:30:42.315 "supported_io_types": { 00:30:42.315 "read": true, 00:30:42.315 "write": true, 00:30:42.315 "unmap": true, 00:30:42.315 "write_zeroes": true, 00:30:42.315 "flush": true, 00:30:42.315 "reset": true, 00:30:42.315 "compare": false, 00:30:42.315 "compare_and_write": false, 00:30:42.315 "abort": true, 00:30:42.315 "nvme_admin": false, 00:30:42.315 "nvme_io": false 00:30:42.315 }, 00:30:42.315 "memory_domains": [ 00:30:42.315 { 00:30:42.315 "dma_device_id": "system", 00:30:42.315 "dma_device_type": 1 00:30:42.315 }, 00:30:42.315 { 00:30:42.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:42.315 "dma_device_type": 2 00:30:42.315 } 00:30:42.315 ], 00:30:42.315 "driver_specific": {} 00:30:42.315 } 00:30:42.315 ] 
00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:42.315 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:42.574 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:42.574 "name": "Existed_Raid", 00:30:42.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:42.574 "strip_size_kb": 64, 00:30:42.574 "state": "configuring", 00:30:42.574 "raid_level": "concat", 00:30:42.574 "superblock": false, 00:30:42.574 "num_base_bdevs": 3, 00:30:42.574 
"num_base_bdevs_discovered": 1, 00:30:42.574 "num_base_bdevs_operational": 3, 00:30:42.574 "base_bdevs_list": [ 00:30:42.574 { 00:30:42.574 "name": "BaseBdev1", 00:30:42.574 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:42.574 "is_configured": true, 00:30:42.574 "data_offset": 0, 00:30:42.574 "data_size": 65536 00:30:42.574 }, 00:30:42.574 { 00:30:42.574 "name": "BaseBdev2", 00:30:42.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:42.574 "is_configured": false, 00:30:42.574 "data_offset": 0, 00:30:42.574 "data_size": 0 00:30:42.574 }, 00:30:42.574 { 00:30:42.574 "name": "BaseBdev3", 00:30:42.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:42.574 "is_configured": false, 00:30:42.574 "data_offset": 0, 00:30:42.574 "data_size": 0 00:30:42.574 } 00:30:42.574 ] 00:30:42.574 }' 00:30:42.574 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:42.574 11:41:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:43.141 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:43.141 [2024-06-10 11:41:26.965889] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:43.141 [2024-06-10 11:41:26.965919] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2395e00 name Existed_Raid, state configuring 00:30:43.141 11:41:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:30:43.400 [2024-06-10 11:41:27.146376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:43.400 [2024-06-10 11:41:27.147474] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev2 00:30:43.400 [2024-06-10 11:41:27.147500] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:43.400 [2024-06-10 11:41:27.147506] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:30:43.400 [2024-06-10 11:41:27.147514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:43.400 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:43.400 11:41:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:43.658 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:43.658 "name": "Existed_Raid", 00:30:43.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.658 "strip_size_kb": 64, 00:30:43.658 "state": "configuring", 00:30:43.658 "raid_level": "concat", 00:30:43.658 "superblock": false, 00:30:43.658 "num_base_bdevs": 3, 00:30:43.658 "num_base_bdevs_discovered": 1, 00:30:43.658 "num_base_bdevs_operational": 3, 00:30:43.658 "base_bdevs_list": [ 00:30:43.658 { 00:30:43.658 "name": "BaseBdev1", 00:30:43.658 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:43.658 "is_configured": true, 00:30:43.658 "data_offset": 0, 00:30:43.658 "data_size": 65536 00:30:43.658 }, 00:30:43.658 { 00:30:43.658 "name": "BaseBdev2", 00:30:43.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.658 "is_configured": false, 00:30:43.658 "data_offset": 0, 00:30:43.658 "data_size": 0 00:30:43.658 }, 00:30:43.658 { 00:30:43.658 "name": "BaseBdev3", 00:30:43.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.658 "is_configured": false, 00:30:43.658 "data_offset": 0, 00:30:43.658 "data_size": 0 00:30:43.658 } 00:30:43.658 ] 00:30:43.658 }' 00:30:43.658 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:43.658 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:43.916 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:30:44.175 [2024-06-10 11:41:27.975327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:44.175 BaseBdev2 00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 
00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:44.175 11:41:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:44.433 11:41:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:44.433 [ 00:30:44.433 { 00:30:44.433 "name": "BaseBdev2", 00:30:44.433 "aliases": [ 00:30:44.433 "c0d1af29-cca8-461e-8fdc-00394cd82227" 00:30:44.433 ], 00:30:44.433 "product_name": "Malloc disk", 00:30:44.433 "block_size": 512, 00:30:44.433 "num_blocks": 65536, 00:30:44.433 "uuid": "c0d1af29-cca8-461e-8fdc-00394cd82227", 00:30:44.433 "assigned_rate_limits": { 00:30:44.433 "rw_ios_per_sec": 0, 00:30:44.433 "rw_mbytes_per_sec": 0, 00:30:44.433 "r_mbytes_per_sec": 0, 00:30:44.433 "w_mbytes_per_sec": 0 00:30:44.433 }, 00:30:44.433 "claimed": true, 00:30:44.433 "claim_type": "exclusive_write", 00:30:44.433 "zoned": false, 00:30:44.433 "supported_io_types": { 00:30:44.433 "read": true, 00:30:44.433 "write": true, 00:30:44.433 "unmap": true, 00:30:44.433 "write_zeroes": true, 00:30:44.433 "flush": true, 00:30:44.433 "reset": true, 00:30:44.433 "compare": false, 00:30:44.433 "compare_and_write": false, 00:30:44.433 "abort": true, 00:30:44.433 "nvme_admin": false, 00:30:44.433 "nvme_io": false 
00:30:44.433 }, 00:30:44.433 "memory_domains": [ 00:30:44.433 { 00:30:44.433 "dma_device_id": "system", 00:30:44.433 "dma_device_type": 1 00:30:44.433 }, 00:30:44.433 { 00:30:44.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:44.434 "dma_device_type": 2 00:30:44.434 } 00:30:44.434 ], 00:30:44.434 "driver_specific": {} 00:30:44.434 } 00:30:44.434 ] 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:44.434 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:44.434 11:41:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.692 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:44.692 "name": "Existed_Raid", 00:30:44.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:44.692 "strip_size_kb": 64, 00:30:44.692 "state": "configuring", 00:30:44.692 "raid_level": "concat", 00:30:44.692 "superblock": false, 00:30:44.692 "num_base_bdevs": 3, 00:30:44.692 "num_base_bdevs_discovered": 2, 00:30:44.692 "num_base_bdevs_operational": 3, 00:30:44.692 "base_bdevs_list": [ 00:30:44.692 { 00:30:44.692 "name": "BaseBdev1", 00:30:44.692 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:44.692 "is_configured": true, 00:30:44.692 "data_offset": 0, 00:30:44.692 "data_size": 65536 00:30:44.692 }, 00:30:44.692 { 00:30:44.692 "name": "BaseBdev2", 00:30:44.692 "uuid": "c0d1af29-cca8-461e-8fdc-00394cd82227", 00:30:44.692 "is_configured": true, 00:30:44.692 "data_offset": 0, 00:30:44.692 "data_size": 65536 00:30:44.692 }, 00:30:44.692 { 00:30:44.692 "name": "BaseBdev3", 00:30:44.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:44.692 "is_configured": false, 00:30:44.692 "data_offset": 0, 00:30:44.692 "data_size": 0 00:30:44.692 } 00:30:44.692 ] 00:30:44.692 }' 00:30:44.692 11:41:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:44.692 11:41:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:45.260 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:30:45.260 [2024-06-10 11:41:29.193319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:45.260 [2024-06-10 11:41:29.193348] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2396cf0 00:30:45.260 [2024-06-10 11:41:29.193354] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:30:45.260 [2024-06-10 11:41:29.193488] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23adbe0 00:30:45.260 [2024-06-10 11:41:29.193571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2396cf0 00:30:45.260 [2024-06-10 11:41:29.193577] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2396cf0 00:30:45.260 [2024-06-10 11:41:29.193693] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:45.260 BaseBdev3 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:45.519 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:30:45.778 [ 00:30:45.778 { 00:30:45.778 "name": "BaseBdev3", 00:30:45.778 "aliases": [ 00:30:45.778 "9a549b07-0b39-458e-bf15-757e81480f2f" 00:30:45.778 ], 
00:30:45.778 "product_name": "Malloc disk", 00:30:45.778 "block_size": 512, 00:30:45.778 "num_blocks": 65536, 00:30:45.778 "uuid": "9a549b07-0b39-458e-bf15-757e81480f2f", 00:30:45.778 "assigned_rate_limits": { 00:30:45.778 "rw_ios_per_sec": 0, 00:30:45.778 "rw_mbytes_per_sec": 0, 00:30:45.778 "r_mbytes_per_sec": 0, 00:30:45.778 "w_mbytes_per_sec": 0 00:30:45.778 }, 00:30:45.778 "claimed": true, 00:30:45.778 "claim_type": "exclusive_write", 00:30:45.778 "zoned": false, 00:30:45.778 "supported_io_types": { 00:30:45.778 "read": true, 00:30:45.778 "write": true, 00:30:45.778 "unmap": true, 00:30:45.778 "write_zeroes": true, 00:30:45.778 "flush": true, 00:30:45.778 "reset": true, 00:30:45.778 "compare": false, 00:30:45.778 "compare_and_write": false, 00:30:45.778 "abort": true, 00:30:45.778 "nvme_admin": false, 00:30:45.778 "nvme_io": false 00:30:45.778 }, 00:30:45.778 "memory_domains": [ 00:30:45.778 { 00:30:45.778 "dma_device_id": "system", 00:30:45.778 "dma_device_type": 1 00:30:45.778 }, 00:30:45.778 { 00:30:45.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:45.778 "dma_device_type": 2 00:30:45.778 } 00:30:45.778 ], 00:30:45.778 "driver_specific": {} 00:30:45.778 } 00:30:45.778 ] 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:45.778 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:46.036 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:46.036 "name": "Existed_Raid", 00:30:46.036 "uuid": "e694ec83-b0ce-4531-ad22-c68e23ec9c81", 00:30:46.036 "strip_size_kb": 64, 00:30:46.036 "state": "online", 00:30:46.037 "raid_level": "concat", 00:30:46.037 "superblock": false, 00:30:46.037 "num_base_bdevs": 3, 00:30:46.037 "num_base_bdevs_discovered": 3, 00:30:46.037 "num_base_bdevs_operational": 3, 00:30:46.037 "base_bdevs_list": [ 00:30:46.037 { 00:30:46.037 "name": "BaseBdev1", 00:30:46.037 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:46.037 "is_configured": true, 00:30:46.037 "data_offset": 0, 00:30:46.037 "data_size": 65536 00:30:46.037 }, 00:30:46.037 { 00:30:46.037 "name": "BaseBdev2", 00:30:46.037 "uuid": "c0d1af29-cca8-461e-8fdc-00394cd82227", 00:30:46.037 "is_configured": true, 00:30:46.037 "data_offset": 0, 00:30:46.037 "data_size": 65536 00:30:46.037 }, 00:30:46.037 { 00:30:46.037 "name": 
"BaseBdev3", 00:30:46.037 "uuid": "9a549b07-0b39-458e-bf15-757e81480f2f", 00:30:46.037 "is_configured": true, 00:30:46.037 "data_offset": 0, 00:30:46.037 "data_size": 65536 00:30:46.037 } 00:30:46.037 ] 00:30:46.037 }' 00:30:46.037 11:41:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:46.037 11:41:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:46.295 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:30:46.295 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:46.295 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:46.295 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:46.554 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:46.554 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:46.554 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:46.554 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:46.554 [2024-06-10 11:41:30.396613] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:46.554 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:46.554 "name": "Existed_Raid", 00:30:46.554 "aliases": [ 00:30:46.554 "e694ec83-b0ce-4531-ad22-c68e23ec9c81" 00:30:46.554 ], 00:30:46.554 "product_name": "Raid Volume", 00:30:46.554 "block_size": 512, 00:30:46.554 "num_blocks": 196608, 00:30:46.554 "uuid": "e694ec83-b0ce-4531-ad22-c68e23ec9c81", 00:30:46.554 "assigned_rate_limits": { 00:30:46.554 "rw_ios_per_sec": 0, 
00:30:46.554 "rw_mbytes_per_sec": 0, 00:30:46.554 "r_mbytes_per_sec": 0, 00:30:46.554 "w_mbytes_per_sec": 0 00:30:46.554 }, 00:30:46.554 "claimed": false, 00:30:46.554 "zoned": false, 00:30:46.554 "supported_io_types": { 00:30:46.554 "read": true, 00:30:46.554 "write": true, 00:30:46.554 "unmap": true, 00:30:46.554 "write_zeroes": true, 00:30:46.554 "flush": true, 00:30:46.554 "reset": true, 00:30:46.554 "compare": false, 00:30:46.554 "compare_and_write": false, 00:30:46.555 "abort": false, 00:30:46.555 "nvme_admin": false, 00:30:46.555 "nvme_io": false 00:30:46.555 }, 00:30:46.555 "memory_domains": [ 00:30:46.555 { 00:30:46.555 "dma_device_id": "system", 00:30:46.555 "dma_device_type": 1 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.555 "dma_device_type": 2 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "dma_device_id": "system", 00:30:46.555 "dma_device_type": 1 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.555 "dma_device_type": 2 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "dma_device_id": "system", 00:30:46.555 "dma_device_type": 1 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.555 "dma_device_type": 2 00:30:46.555 } 00:30:46.555 ], 00:30:46.555 "driver_specific": { 00:30:46.555 "raid": { 00:30:46.555 "uuid": "e694ec83-b0ce-4531-ad22-c68e23ec9c81", 00:30:46.555 "strip_size_kb": 64, 00:30:46.555 "state": "online", 00:30:46.555 "raid_level": "concat", 00:30:46.555 "superblock": false, 00:30:46.555 "num_base_bdevs": 3, 00:30:46.555 "num_base_bdevs_discovered": 3, 00:30:46.555 "num_base_bdevs_operational": 3, 00:30:46.555 "base_bdevs_list": [ 00:30:46.555 { 00:30:46.555 "name": "BaseBdev1", 00:30:46.555 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:46.555 "is_configured": true, 00:30:46.555 "data_offset": 0, 00:30:46.555 "data_size": 65536 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "name": "BaseBdev2", 00:30:46.555 
"uuid": "c0d1af29-cca8-461e-8fdc-00394cd82227", 00:30:46.555 "is_configured": true, 00:30:46.555 "data_offset": 0, 00:30:46.555 "data_size": 65536 00:30:46.555 }, 00:30:46.555 { 00:30:46.555 "name": "BaseBdev3", 00:30:46.555 "uuid": "9a549b07-0b39-458e-bf15-757e81480f2f", 00:30:46.555 "is_configured": true, 00:30:46.555 "data_offset": 0, 00:30:46.555 "data_size": 65536 00:30:46.555 } 00:30:46.555 ] 00:30:46.555 } 00:30:46.555 } 00:30:46.555 }' 00:30:46.555 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:46.555 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:30:46.555 BaseBdev2 00:30:46.555 BaseBdev3' 00:30:46.555 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:46.555 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:30:46.555 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:46.813 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:46.813 "name": "BaseBdev1", 00:30:46.813 "aliases": [ 00:30:46.813 "62a4eae2-53ca-41e7-accb-0e5fd0660095" 00:30:46.813 ], 00:30:46.813 "product_name": "Malloc disk", 00:30:46.813 "block_size": 512, 00:30:46.813 "num_blocks": 65536, 00:30:46.813 "uuid": "62a4eae2-53ca-41e7-accb-0e5fd0660095", 00:30:46.813 "assigned_rate_limits": { 00:30:46.813 "rw_ios_per_sec": 0, 00:30:46.813 "rw_mbytes_per_sec": 0, 00:30:46.813 "r_mbytes_per_sec": 0, 00:30:46.813 "w_mbytes_per_sec": 0 00:30:46.813 }, 00:30:46.813 "claimed": true, 00:30:46.813 "claim_type": "exclusive_write", 00:30:46.813 "zoned": false, 00:30:46.813 "supported_io_types": { 00:30:46.813 "read": true, 00:30:46.813 "write": true, 
00:30:46.813 "unmap": true, 00:30:46.813 "write_zeroes": true, 00:30:46.813 "flush": true, 00:30:46.813 "reset": true, 00:30:46.813 "compare": false, 00:30:46.813 "compare_and_write": false, 00:30:46.813 "abort": true, 00:30:46.813 "nvme_admin": false, 00:30:46.813 "nvme_io": false 00:30:46.813 }, 00:30:46.813 "memory_domains": [ 00:30:46.813 { 00:30:46.813 "dma_device_id": "system", 00:30:46.813 "dma_device_type": 1 00:30:46.813 }, 00:30:46.813 { 00:30:46.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.813 "dma_device_type": 2 00:30:46.813 } 00:30:46.813 ], 00:30:46.813 "driver_specific": {} 00:30:46.813 }' 00:30:46.813 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:46.813 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:46.813 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:46.813 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:46.813 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:47.072 11:41:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:47.072 11:41:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:47.331 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:47.331 "name": "BaseBdev2", 00:30:47.331 "aliases": [ 00:30:47.331 "c0d1af29-cca8-461e-8fdc-00394cd82227" 00:30:47.331 ], 00:30:47.331 "product_name": "Malloc disk", 00:30:47.331 "block_size": 512, 00:30:47.331 "num_blocks": 65536, 00:30:47.331 "uuid": "c0d1af29-cca8-461e-8fdc-00394cd82227", 00:30:47.331 "assigned_rate_limits": { 00:30:47.331 "rw_ios_per_sec": 0, 00:30:47.331 "rw_mbytes_per_sec": 0, 00:30:47.331 "r_mbytes_per_sec": 0, 00:30:47.331 "w_mbytes_per_sec": 0 00:30:47.331 }, 00:30:47.331 "claimed": true, 00:30:47.331 "claim_type": "exclusive_write", 00:30:47.331 "zoned": false, 00:30:47.331 "supported_io_types": { 00:30:47.331 "read": true, 00:30:47.331 "write": true, 00:30:47.331 "unmap": true, 00:30:47.331 "write_zeroes": true, 00:30:47.331 "flush": true, 00:30:47.331 "reset": true, 00:30:47.331 "compare": false, 00:30:47.331 "compare_and_write": false, 00:30:47.331 "abort": true, 00:30:47.331 "nvme_admin": false, 00:30:47.331 "nvme_io": false 00:30:47.331 }, 00:30:47.331 "memory_domains": [ 00:30:47.331 { 00:30:47.331 "dma_device_id": "system", 00:30:47.331 "dma_device_type": 1 00:30:47.331 }, 00:30:47.331 { 00:30:47.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:47.331 "dma_device_type": 2 00:30:47.331 } 00:30:47.331 ], 00:30:47.331 "driver_specific": {} 00:30:47.331 }' 00:30:47.331 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.331 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.331 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:30:47.331 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.331 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:30:47.590 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:47.848 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:47.848 "name": "BaseBdev3", 00:30:47.848 "aliases": [ 00:30:47.848 "9a549b07-0b39-458e-bf15-757e81480f2f" 00:30:47.848 ], 00:30:47.848 "product_name": "Malloc disk", 00:30:47.848 "block_size": 512, 00:30:47.848 "num_blocks": 65536, 00:30:47.848 "uuid": "9a549b07-0b39-458e-bf15-757e81480f2f", 00:30:47.848 "assigned_rate_limits": { 00:30:47.848 "rw_ios_per_sec": 0, 00:30:47.849 "rw_mbytes_per_sec": 0, 00:30:47.849 "r_mbytes_per_sec": 0, 00:30:47.849 "w_mbytes_per_sec": 0 00:30:47.849 }, 00:30:47.849 "claimed": true, 00:30:47.849 
"claim_type": "exclusive_write", 00:30:47.849 "zoned": false, 00:30:47.849 "supported_io_types": { 00:30:47.849 "read": true, 00:30:47.849 "write": true, 00:30:47.849 "unmap": true, 00:30:47.849 "write_zeroes": true, 00:30:47.849 "flush": true, 00:30:47.849 "reset": true, 00:30:47.849 "compare": false, 00:30:47.849 "compare_and_write": false, 00:30:47.849 "abort": true, 00:30:47.849 "nvme_admin": false, 00:30:47.849 "nvme_io": false 00:30:47.849 }, 00:30:47.849 "memory_domains": [ 00:30:47.849 { 00:30:47.849 "dma_device_id": "system", 00:30:47.849 "dma_device_type": 1 00:30:47.849 }, 00:30:47.849 { 00:30:47.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:47.849 "dma_device_type": 2 00:30:47.849 } 00:30:47.849 ], 00:30:47.849 "driver_specific": {} 00:30:47.849 }' 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.849 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:48.107 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:48.107 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:48.107 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:48.107 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null 
== null ]] 00:30:48.107 11:41:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:48.366 [2024-06-10 11:41:32.064806] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:48.366 [2024-06-10 11:41:32.064825] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:48.366 [2024-06-10 11:41:32.064852] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:48.366 "name": "Existed_Raid", 00:30:48.366 "uuid": "e694ec83-b0ce-4531-ad22-c68e23ec9c81", 00:30:48.366 "strip_size_kb": 64, 00:30:48.366 "state": "offline", 00:30:48.366 "raid_level": "concat", 00:30:48.366 "superblock": false, 00:30:48.366 "num_base_bdevs": 3, 00:30:48.366 "num_base_bdevs_discovered": 2, 00:30:48.366 "num_base_bdevs_operational": 2, 00:30:48.366 "base_bdevs_list": [ 00:30:48.366 { 00:30:48.366 "name": null, 00:30:48.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:48.366 "is_configured": false, 00:30:48.366 "data_offset": 0, 00:30:48.366 "data_size": 65536 00:30:48.366 }, 00:30:48.366 { 00:30:48.366 "name": "BaseBdev2", 00:30:48.366 "uuid": "c0d1af29-cca8-461e-8fdc-00394cd82227", 00:30:48.366 "is_configured": true, 00:30:48.366 "data_offset": 0, 00:30:48.366 "data_size": 65536 00:30:48.366 }, 00:30:48.366 { 00:30:48.366 "name": "BaseBdev3", 00:30:48.366 "uuid": "9a549b07-0b39-458e-bf15-757e81480f2f", 00:30:48.366 "is_configured": true, 00:30:48.366 "data_offset": 0, 00:30:48.366 "data_size": 65536 00:30:48.366 } 00:30:48.366 ] 00:30:48.366 }' 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:48.366 11:41:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 
00:30:48.932 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:30:48.932 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:48.932 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.932 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:49.190 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:49.190 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:49.190 11:41:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:30:49.190 [2024-06-10 11:41:33.120365] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:49.448 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev3 00:30:49.707 [2024-06-10 11:41:33.485119] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:30:49.707 [2024-06-10 11:41:33.485152] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2396cf0 name Existed_Raid, state offline 00:30:49.707 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:49.707 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:49.707 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:49.707 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:30:49.967 BaseBdev2 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # local i 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:49.967 11:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:49.968 11:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:50.227 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:50.485 [ 00:30:50.485 { 00:30:50.485 "name": "BaseBdev2", 00:30:50.485 "aliases": [ 00:30:50.485 "564b430b-50d9-475f-aced-7603651879e3" 00:30:50.485 ], 00:30:50.485 "product_name": "Malloc disk", 00:30:50.485 "block_size": 512, 00:30:50.485 "num_blocks": 65536, 00:30:50.485 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:50.485 "assigned_rate_limits": { 00:30:50.485 "rw_ios_per_sec": 0, 00:30:50.485 "rw_mbytes_per_sec": 0, 00:30:50.485 "r_mbytes_per_sec": 0, 00:30:50.485 "w_mbytes_per_sec": 0 00:30:50.485 }, 00:30:50.485 "claimed": false, 00:30:50.485 "zoned": false, 00:30:50.485 "supported_io_types": { 00:30:50.485 "read": true, 00:30:50.485 "write": true, 00:30:50.485 "unmap": true, 00:30:50.485 "write_zeroes": true, 00:30:50.485 "flush": true, 00:30:50.486 "reset": true, 00:30:50.486 "compare": false, 00:30:50.486 "compare_and_write": false, 00:30:50.486 "abort": true, 00:30:50.486 "nvme_admin": false, 00:30:50.486 "nvme_io": false 00:30:50.486 }, 00:30:50.486 "memory_domains": [ 00:30:50.486 { 00:30:50.486 "dma_device_id": "system", 00:30:50.486 "dma_device_type": 1 00:30:50.486 }, 00:30:50.486 { 00:30:50.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:50.486 "dma_device_type": 2 00:30:50.486 } 00:30:50.486 ], 00:30:50.486 "driver_specific": {} 00:30:50.486 } 
00:30:50.486 ] 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:30:50.486 BaseBdev3 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:50.486 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:50.745 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:30:51.005 [ 00:30:51.005 { 00:30:51.005 "name": "BaseBdev3", 00:30:51.005 "aliases": [ 00:30:51.005 "31d3a3ad-4bac-477a-bb0d-e5a68000eb32" 00:30:51.005 ], 00:30:51.005 "product_name": "Malloc disk", 00:30:51.005 "block_size": 512, 00:30:51.005 "num_blocks": 65536, 00:30:51.005 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 
00:30:51.005 "assigned_rate_limits": { 00:30:51.005 "rw_ios_per_sec": 0, 00:30:51.005 "rw_mbytes_per_sec": 0, 00:30:51.005 "r_mbytes_per_sec": 0, 00:30:51.005 "w_mbytes_per_sec": 0 00:30:51.005 }, 00:30:51.005 "claimed": false, 00:30:51.005 "zoned": false, 00:30:51.005 "supported_io_types": { 00:30:51.005 "read": true, 00:30:51.005 "write": true, 00:30:51.005 "unmap": true, 00:30:51.005 "write_zeroes": true, 00:30:51.005 "flush": true, 00:30:51.005 "reset": true, 00:30:51.005 "compare": false, 00:30:51.005 "compare_and_write": false, 00:30:51.005 "abort": true, 00:30:51.005 "nvme_admin": false, 00:30:51.005 "nvme_io": false 00:30:51.005 }, 00:30:51.005 "memory_domains": [ 00:30:51.005 { 00:30:51.005 "dma_device_id": "system", 00:30:51.005 "dma_device_type": 1 00:30:51.005 }, 00:30:51.005 { 00:30:51.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:51.005 "dma_device_type": 2 00:30:51.005 } 00:30:51.005 ], 00:30:51.005 "driver_specific": {} 00:30:51.005 } 00:30:51.005 ] 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:30:51.005 [2024-06-10 11:41:34.851801] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:51.005 [2024-06-10 11:41:34.851836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:51.005 [2024-06-10 11:41:34.851856] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:51.005 [2024-06-10 11:41:34.852856] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.005 11:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:51.264 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:51.264 "name": "Existed_Raid", 00:30:51.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:51.264 "strip_size_kb": 64, 00:30:51.264 "state": "configuring", 00:30:51.264 "raid_level": "concat", 00:30:51.264 "superblock": false, 00:30:51.264 "num_base_bdevs": 3, 00:30:51.264 "num_base_bdevs_discovered": 
2, 00:30:51.264 "num_base_bdevs_operational": 3, 00:30:51.264 "base_bdevs_list": [ 00:30:51.264 { 00:30:51.264 "name": "BaseBdev1", 00:30:51.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:51.264 "is_configured": false, 00:30:51.264 "data_offset": 0, 00:30:51.264 "data_size": 0 00:30:51.264 }, 00:30:51.264 { 00:30:51.264 "name": "BaseBdev2", 00:30:51.264 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:51.264 "is_configured": true, 00:30:51.264 "data_offset": 0, 00:30:51.264 "data_size": 65536 00:30:51.264 }, 00:30:51.264 { 00:30:51.264 "name": "BaseBdev3", 00:30:51.264 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:51.264 "is_configured": true, 00:30:51.264 "data_offset": 0, 00:30:51.264 "data_size": 65536 00:30:51.264 } 00:30:51.264 ] 00:30:51.264 }' 00:30:51.264 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:51.264 11:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:30:51.832 [2024-06-10 11:41:35.685934] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.832 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:52.091 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:52.091 "name": "Existed_Raid", 00:30:52.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:52.091 "strip_size_kb": 64, 00:30:52.091 "state": "configuring", 00:30:52.091 "raid_level": "concat", 00:30:52.091 "superblock": false, 00:30:52.091 "num_base_bdevs": 3, 00:30:52.092 "num_base_bdevs_discovered": 1, 00:30:52.092 "num_base_bdevs_operational": 3, 00:30:52.092 "base_bdevs_list": [ 00:30:52.092 { 00:30:52.092 "name": "BaseBdev1", 00:30:52.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:52.092 "is_configured": false, 00:30:52.092 "data_offset": 0, 00:30:52.092 "data_size": 0 00:30:52.092 }, 00:30:52.092 { 00:30:52.092 "name": null, 00:30:52.092 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:52.092 "is_configured": false, 00:30:52.092 "data_offset": 0, 00:30:52.092 "data_size": 65536 00:30:52.092 }, 00:30:52.092 { 00:30:52.092 "name": "BaseBdev3", 00:30:52.092 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:52.092 "is_configured": true, 00:30:52.092 "data_offset": 0, 00:30:52.092 "data_size": 65536 00:30:52.092 } 00:30:52.092 ] 00:30:52.092 
}' 00:30:52.092 11:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:52.092 11:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:52.658 11:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:52.658 11:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:30:52.658 11:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:30:52.658 11:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:30:52.917 [2024-06-10 11:41:36.739574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:52.917 BaseBdev1 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:52.917 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:53.176 11:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:53.176 [ 00:30:53.176 { 00:30:53.176 "name": "BaseBdev1", 00:30:53.176 "aliases": [ 00:30:53.176 "d25ff60f-4a44-420e-8ae9-9605fb52cbf0" 00:30:53.176 ], 00:30:53.176 "product_name": "Malloc disk", 00:30:53.176 "block_size": 512, 00:30:53.176 "num_blocks": 65536, 00:30:53.176 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:53.176 "assigned_rate_limits": { 00:30:53.176 "rw_ios_per_sec": 0, 00:30:53.176 "rw_mbytes_per_sec": 0, 00:30:53.176 "r_mbytes_per_sec": 0, 00:30:53.176 "w_mbytes_per_sec": 0 00:30:53.176 }, 00:30:53.176 "claimed": true, 00:30:53.176 "claim_type": "exclusive_write", 00:30:53.176 "zoned": false, 00:30:53.176 "supported_io_types": { 00:30:53.176 "read": true, 00:30:53.176 "write": true, 00:30:53.176 "unmap": true, 00:30:53.176 "write_zeroes": true, 00:30:53.176 "flush": true, 00:30:53.176 "reset": true, 00:30:53.176 "compare": false, 00:30:53.176 "compare_and_write": false, 00:30:53.176 "abort": true, 00:30:53.176 "nvme_admin": false, 00:30:53.176 "nvme_io": false 00:30:53.176 }, 00:30:53.176 "memory_domains": [ 00:30:53.176 { 00:30:53.176 "dma_device_id": "system", 00:30:53.176 "dma_device_type": 1 00:30:53.176 }, 00:30:53.176 { 00:30:53.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:53.177 "dma_device_type": 2 00:30:53.177 } 00:30:53.177 ], 00:30:53.177 "driver_specific": {} 00:30:53.177 } 00:30:53.177 ] 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:53.177 11:41:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:53.177 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:53.435 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:53.435 "name": "Existed_Raid", 00:30:53.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:53.436 "strip_size_kb": 64, 00:30:53.436 "state": "configuring", 00:30:53.436 "raid_level": "concat", 00:30:53.436 "superblock": false, 00:30:53.436 "num_base_bdevs": 3, 00:30:53.436 "num_base_bdevs_discovered": 2, 00:30:53.436 "num_base_bdevs_operational": 3, 00:30:53.436 "base_bdevs_list": [ 00:30:53.436 { 00:30:53.436 "name": "BaseBdev1", 00:30:53.436 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:53.436 "is_configured": true, 00:30:53.436 "data_offset": 0, 00:30:53.436 "data_size": 65536 00:30:53.436 }, 00:30:53.436 { 00:30:53.436 "name": null, 00:30:53.436 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:53.436 "is_configured": false, 00:30:53.436 "data_offset": 0, 00:30:53.436 
"data_size": 65536 00:30:53.436 }, 00:30:53.436 { 00:30:53.436 "name": "BaseBdev3", 00:30:53.436 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:53.436 "is_configured": true, 00:30:53.436 "data_offset": 0, 00:30:53.436 "data_size": 65536 00:30:53.436 } 00:30:53.436 ] 00:30:53.436 }' 00:30:53.436 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:53.436 11:41:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:54.002 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:54.002 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:30:54.002 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:30:54.002 11:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:30:54.260 [2024-06-10 11:41:38.091110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:54.260 
11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:54.260 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:54.519 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:54.519 "name": "Existed_Raid", 00:30:54.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:54.519 "strip_size_kb": 64, 00:30:54.519 "state": "configuring", 00:30:54.519 "raid_level": "concat", 00:30:54.519 "superblock": false, 00:30:54.519 "num_base_bdevs": 3, 00:30:54.519 "num_base_bdevs_discovered": 1, 00:30:54.519 "num_base_bdevs_operational": 3, 00:30:54.519 "base_bdevs_list": [ 00:30:54.519 { 00:30:54.519 "name": "BaseBdev1", 00:30:54.519 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:54.519 "is_configured": true, 00:30:54.519 "data_offset": 0, 00:30:54.519 "data_size": 65536 00:30:54.519 }, 00:30:54.519 { 00:30:54.519 "name": null, 00:30:54.519 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:54.519 "is_configured": false, 00:30:54.519 "data_offset": 0, 00:30:54.519 "data_size": 65536 00:30:54.519 }, 00:30:54.519 { 00:30:54.519 "name": null, 00:30:54.519 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:54.519 "is_configured": false, 00:30:54.519 "data_offset": 0, 00:30:54.519 "data_size": 65536 00:30:54.519 } 00:30:54.519 ] 00:30:54.519 }' 00:30:54.519 11:41:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:54.519 11:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:55.087 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:30:55.087 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:55.087 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:30:55.087 11:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:30:55.346 [2024-06-10 11:41:39.145827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:55.346 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:55.605 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:55.605 "name": "Existed_Raid", 00:30:55.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:55.605 "strip_size_kb": 64, 00:30:55.605 "state": "configuring", 00:30:55.605 "raid_level": "concat", 00:30:55.605 "superblock": false, 00:30:55.605 "num_base_bdevs": 3, 00:30:55.605 "num_base_bdevs_discovered": 2, 00:30:55.605 "num_base_bdevs_operational": 3, 00:30:55.605 "base_bdevs_list": [ 00:30:55.605 { 00:30:55.605 "name": "BaseBdev1", 00:30:55.605 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:55.605 "is_configured": true, 00:30:55.605 "data_offset": 0, 00:30:55.605 "data_size": 65536 00:30:55.605 }, 00:30:55.605 { 00:30:55.605 "name": null, 00:30:55.605 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:55.605 "is_configured": false, 00:30:55.605 "data_offset": 0, 00:30:55.605 "data_size": 65536 00:30:55.605 }, 00:30:55.605 { 00:30:55.605 "name": "BaseBdev3", 00:30:55.605 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:55.605 "is_configured": true, 00:30:55.605 "data_offset": 0, 00:30:55.605 "data_size": 65536 00:30:55.605 } 00:30:55.605 ] 00:30:55.605 }' 00:30:55.605 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:55.605 11:41:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:56.174 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.174 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:30:56.174 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:30:56.174 11:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:56.433 [2024-06-10 11:41:40.152438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:56.433 "name": "Existed_Raid", 00:30:56.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:56.433 "strip_size_kb": 64, 00:30:56.433 "state": "configuring", 00:30:56.433 "raid_level": "concat", 00:30:56.433 "superblock": false, 00:30:56.433 "num_base_bdevs": 3, 00:30:56.433 "num_base_bdevs_discovered": 1, 00:30:56.433 "num_base_bdevs_operational": 3, 00:30:56.433 "base_bdevs_list": [ 00:30:56.433 { 00:30:56.433 "name": null, 00:30:56.433 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:56.433 "is_configured": false, 00:30:56.433 "data_offset": 0, 00:30:56.433 "data_size": 65536 00:30:56.433 }, 00:30:56.433 { 00:30:56.433 "name": null, 00:30:56.433 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:56.433 "is_configured": false, 00:30:56.433 "data_offset": 0, 00:30:56.433 "data_size": 65536 00:30:56.433 }, 00:30:56.433 { 00:30:56.433 "name": "BaseBdev3", 00:30:56.433 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:56.433 "is_configured": true, 00:30:56.433 "data_offset": 0, 00:30:56.433 "data_size": 65536 00:30:56.433 } 00:30:56.433 ] 00:30:56.433 }' 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:56.433 11:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:57.002 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:57.002 11:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:30:57.261 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
[[ false == \f\a\l\s\e ]] 00:30:57.261 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:30:57.261 [2024-06-10 11:41:41.202516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:57.520 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:57.520 "name": 
"Existed_Raid", 00:30:57.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:57.520 "strip_size_kb": 64, 00:30:57.520 "state": "configuring", 00:30:57.520 "raid_level": "concat", 00:30:57.520 "superblock": false, 00:30:57.520 "num_base_bdevs": 3, 00:30:57.520 "num_base_bdevs_discovered": 2, 00:30:57.520 "num_base_bdevs_operational": 3, 00:30:57.520 "base_bdevs_list": [ 00:30:57.520 { 00:30:57.520 "name": null, 00:30:57.520 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:57.520 "is_configured": false, 00:30:57.520 "data_offset": 0, 00:30:57.520 "data_size": 65536 00:30:57.520 }, 00:30:57.520 { 00:30:57.520 "name": "BaseBdev2", 00:30:57.520 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:57.520 "is_configured": true, 00:30:57.520 "data_offset": 0, 00:30:57.520 "data_size": 65536 00:30:57.520 }, 00:30:57.520 { 00:30:57.520 "name": "BaseBdev3", 00:30:57.520 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:57.520 "is_configured": true, 00:30:57.520 "data_offset": 0, 00:30:57.521 "data_size": 65536 00:30:57.521 } 00:30:57.521 ] 00:30:57.521 }' 00:30:57.521 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:57.521 11:41:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:58.089 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.089 11:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:30:58.089 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:30:58.089 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.089 11:41:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:30:58.355 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d25ff60f-4a44-420e-8ae9-9605fb52cbf0 00:30:58.617 [2024-06-10 11:41:42.368341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:30:58.617 [2024-06-10 11:41:42.368373] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x238e410 00:30:58.617 [2024-06-10 11:41:42.368379] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:30:58.617 [2024-06-10 11:41:42.368514] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x254a140 00:30:58.617 [2024-06-10 11:41:42.368590] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x238e410 00:30:58.617 [2024-06-10 11:41:42.368596] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x238e410 00:30:58.617 [2024-06-10 11:41:42.368723] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:58.617 NewBaseBdev 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:58.617 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:30:58.876 [ 00:30:58.876 { 00:30:58.876 "name": "NewBaseBdev", 00:30:58.876 "aliases": [ 00:30:58.876 "d25ff60f-4a44-420e-8ae9-9605fb52cbf0" 00:30:58.876 ], 00:30:58.876 "product_name": "Malloc disk", 00:30:58.876 "block_size": 512, 00:30:58.876 "num_blocks": 65536, 00:30:58.876 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:58.876 "assigned_rate_limits": { 00:30:58.876 "rw_ios_per_sec": 0, 00:30:58.876 "rw_mbytes_per_sec": 0, 00:30:58.876 "r_mbytes_per_sec": 0, 00:30:58.876 "w_mbytes_per_sec": 0 00:30:58.876 }, 00:30:58.876 "claimed": true, 00:30:58.876 "claim_type": "exclusive_write", 00:30:58.876 "zoned": false, 00:30:58.876 "supported_io_types": { 00:30:58.876 "read": true, 00:30:58.876 "write": true, 00:30:58.876 "unmap": true, 00:30:58.876 "write_zeroes": true, 00:30:58.876 "flush": true, 00:30:58.876 "reset": true, 00:30:58.876 "compare": false, 00:30:58.876 "compare_and_write": false, 00:30:58.876 "abort": true, 00:30:58.876 "nvme_admin": false, 00:30:58.876 "nvme_io": false 00:30:58.876 }, 00:30:58.876 "memory_domains": [ 00:30:58.876 { 00:30:58.876 "dma_device_id": "system", 00:30:58.876 "dma_device_type": 1 00:30:58.876 }, 00:30:58.876 { 00:30:58.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:58.876 "dma_device_type": 2 00:30:58.876 } 00:30:58.876 ], 00:30:58.876 "driver_specific": {} 00:30:58.876 } 00:30:58.876 ] 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.876 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:59.136 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:59.136 "name": "Existed_Raid", 00:30:59.136 "uuid": "dfa4e361-39da-48b9-9409-509be3a69a36", 00:30:59.136 "strip_size_kb": 64, 00:30:59.136 "state": "online", 00:30:59.136 "raid_level": "concat", 00:30:59.136 "superblock": false, 00:30:59.136 "num_base_bdevs": 3, 00:30:59.136 "num_base_bdevs_discovered": 3, 00:30:59.136 "num_base_bdevs_operational": 3, 00:30:59.136 "base_bdevs_list": [ 00:30:59.136 { 00:30:59.136 "name": "NewBaseBdev", 00:30:59.136 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:59.136 "is_configured": true, 00:30:59.136 "data_offset": 0, 00:30:59.136 "data_size": 65536 
00:30:59.136 }, 00:30:59.136 { 00:30:59.136 "name": "BaseBdev2", 00:30:59.136 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:59.136 "is_configured": true, 00:30:59.136 "data_offset": 0, 00:30:59.136 "data_size": 65536 00:30:59.136 }, 00:30:59.136 { 00:30:59.136 "name": "BaseBdev3", 00:30:59.136 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:59.136 "is_configured": true, 00:30:59.136 "data_offset": 0, 00:30:59.136 "data_size": 65536 00:30:59.136 } 00:30:59.136 ] 00:30:59.136 }' 00:30:59.136 11:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:59.136 11:41:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:59.762 [2024-06-10 11:41:43.539585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:59.762 "name": "Existed_Raid", 00:30:59.762 "aliases": [ 00:30:59.762 
"dfa4e361-39da-48b9-9409-509be3a69a36" 00:30:59.762 ], 00:30:59.762 "product_name": "Raid Volume", 00:30:59.762 "block_size": 512, 00:30:59.762 "num_blocks": 196608, 00:30:59.762 "uuid": "dfa4e361-39da-48b9-9409-509be3a69a36", 00:30:59.762 "assigned_rate_limits": { 00:30:59.762 "rw_ios_per_sec": 0, 00:30:59.762 "rw_mbytes_per_sec": 0, 00:30:59.762 "r_mbytes_per_sec": 0, 00:30:59.762 "w_mbytes_per_sec": 0 00:30:59.762 }, 00:30:59.762 "claimed": false, 00:30:59.762 "zoned": false, 00:30:59.762 "supported_io_types": { 00:30:59.762 "read": true, 00:30:59.762 "write": true, 00:30:59.762 "unmap": true, 00:30:59.762 "write_zeroes": true, 00:30:59.762 "flush": true, 00:30:59.762 "reset": true, 00:30:59.762 "compare": false, 00:30:59.762 "compare_and_write": false, 00:30:59.762 "abort": false, 00:30:59.762 "nvme_admin": false, 00:30:59.762 "nvme_io": false 00:30:59.762 }, 00:30:59.762 "memory_domains": [ 00:30:59.762 { 00:30:59.762 "dma_device_id": "system", 00:30:59.762 "dma_device_type": 1 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:59.762 "dma_device_type": 2 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "dma_device_id": "system", 00:30:59.762 "dma_device_type": 1 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:59.762 "dma_device_type": 2 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "dma_device_id": "system", 00:30:59.762 "dma_device_type": 1 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:59.762 "dma_device_type": 2 00:30:59.762 } 00:30:59.762 ], 00:30:59.762 "driver_specific": { 00:30:59.762 "raid": { 00:30:59.762 "uuid": "dfa4e361-39da-48b9-9409-509be3a69a36", 00:30:59.762 "strip_size_kb": 64, 00:30:59.762 "state": "online", 00:30:59.762 "raid_level": "concat", 00:30:59.762 "superblock": false, 00:30:59.762 "num_base_bdevs": 3, 00:30:59.762 "num_base_bdevs_discovered": 3, 00:30:59.762 "num_base_bdevs_operational": 3, 00:30:59.762 
"base_bdevs_list": [ 00:30:59.762 { 00:30:59.762 "name": "NewBaseBdev", 00:30:59.762 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:30:59.762 "is_configured": true, 00:30:59.762 "data_offset": 0, 00:30:59.762 "data_size": 65536 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "name": "BaseBdev2", 00:30:59.762 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:30:59.762 "is_configured": true, 00:30:59.762 "data_offset": 0, 00:30:59.762 "data_size": 65536 00:30:59.762 }, 00:30:59.762 { 00:30:59.762 "name": "BaseBdev3", 00:30:59.762 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:30:59.762 "is_configured": true, 00:30:59.762 "data_offset": 0, 00:30:59.762 "data_size": 65536 00:30:59.762 } 00:30:59.762 ] 00:30:59.762 } 00:30:59.762 } 00:30:59.762 }' 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:30:59.762 BaseBdev2 00:30:59.762 BaseBdev3' 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:30:59.762 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:00.021 "name": "NewBaseBdev", 00:31:00.021 "aliases": [ 00:31:00.021 "d25ff60f-4a44-420e-8ae9-9605fb52cbf0" 00:31:00.021 ], 00:31:00.021 "product_name": "Malloc disk", 00:31:00.021 "block_size": 512, 00:31:00.021 "num_blocks": 65536, 00:31:00.021 "uuid": "d25ff60f-4a44-420e-8ae9-9605fb52cbf0", 00:31:00.021 "assigned_rate_limits": { 00:31:00.021 "rw_ios_per_sec": 0, 
00:31:00.021 "rw_mbytes_per_sec": 0, 00:31:00.021 "r_mbytes_per_sec": 0, 00:31:00.021 "w_mbytes_per_sec": 0 00:31:00.021 }, 00:31:00.021 "claimed": true, 00:31:00.021 "claim_type": "exclusive_write", 00:31:00.021 "zoned": false, 00:31:00.021 "supported_io_types": { 00:31:00.021 "read": true, 00:31:00.021 "write": true, 00:31:00.021 "unmap": true, 00:31:00.021 "write_zeroes": true, 00:31:00.021 "flush": true, 00:31:00.021 "reset": true, 00:31:00.021 "compare": false, 00:31:00.021 "compare_and_write": false, 00:31:00.021 "abort": true, 00:31:00.021 "nvme_admin": false, 00:31:00.021 "nvme_io": false 00:31:00.021 }, 00:31:00.021 "memory_domains": [ 00:31:00.021 { 00:31:00.021 "dma_device_id": "system", 00:31:00.021 "dma_device_type": 1 00:31:00.021 }, 00:31:00.021 { 00:31:00.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:00.021 "dma_device_type": 2 00:31:00.021 } 00:31:00.021 ], 00:31:00.021 "driver_specific": {} 00:31:00.021 }' 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:00.021 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:00.280 11:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:00.280 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:00.280 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:00.280 11:41:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:00.280 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:00.280 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:00.280 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:00.280 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:00.539 "name": "BaseBdev2", 00:31:00.539 "aliases": [ 00:31:00.539 "564b430b-50d9-475f-aced-7603651879e3" 00:31:00.539 ], 00:31:00.539 "product_name": "Malloc disk", 00:31:00.539 "block_size": 512, 00:31:00.539 "num_blocks": 65536, 00:31:00.539 "uuid": "564b430b-50d9-475f-aced-7603651879e3", 00:31:00.539 "assigned_rate_limits": { 00:31:00.539 "rw_ios_per_sec": 0, 00:31:00.539 "rw_mbytes_per_sec": 0, 00:31:00.539 "r_mbytes_per_sec": 0, 00:31:00.539 "w_mbytes_per_sec": 0 00:31:00.539 }, 00:31:00.539 "claimed": true, 00:31:00.539 "claim_type": "exclusive_write", 00:31:00.539 "zoned": false, 00:31:00.539 "supported_io_types": { 00:31:00.539 "read": true, 00:31:00.539 "write": true, 00:31:00.539 "unmap": true, 00:31:00.539 "write_zeroes": true, 00:31:00.539 "flush": true, 00:31:00.539 "reset": true, 00:31:00.539 "compare": false, 00:31:00.539 "compare_and_write": false, 00:31:00.539 "abort": true, 00:31:00.539 "nvme_admin": false, 00:31:00.539 "nvme_io": false 00:31:00.539 }, 00:31:00.539 "memory_domains": [ 00:31:00.539 { 00:31:00.539 "dma_device_id": "system", 00:31:00.539 "dma_device_type": 1 00:31:00.539 }, 00:31:00.539 { 00:31:00.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:00.539 "dma_device_type": 2 00:31:00.539 } 00:31:00.539 ], 00:31:00.539 
"driver_specific": {} 00:31:00.539 }' 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:00.539 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:00.798 "name": "BaseBdev3", 00:31:00.798 "aliases": [ 00:31:00.798 "31d3a3ad-4bac-477a-bb0d-e5a68000eb32" 00:31:00.798 ], 00:31:00.798 "product_name": "Malloc disk", 00:31:00.798 "block_size": 512, 
00:31:00.798 "num_blocks": 65536, 00:31:00.798 "uuid": "31d3a3ad-4bac-477a-bb0d-e5a68000eb32", 00:31:00.798 "assigned_rate_limits": { 00:31:00.798 "rw_ios_per_sec": 0, 00:31:00.798 "rw_mbytes_per_sec": 0, 00:31:00.798 "r_mbytes_per_sec": 0, 00:31:00.798 "w_mbytes_per_sec": 0 00:31:00.798 }, 00:31:00.798 "claimed": true, 00:31:00.798 "claim_type": "exclusive_write", 00:31:00.798 "zoned": false, 00:31:00.798 "supported_io_types": { 00:31:00.798 "read": true, 00:31:00.798 "write": true, 00:31:00.798 "unmap": true, 00:31:00.798 "write_zeroes": true, 00:31:00.798 "flush": true, 00:31:00.798 "reset": true, 00:31:00.798 "compare": false, 00:31:00.798 "compare_and_write": false, 00:31:00.798 "abort": true, 00:31:00.798 "nvme_admin": false, 00:31:00.798 "nvme_io": false 00:31:00.798 }, 00:31:00.798 "memory_domains": [ 00:31:00.798 { 00:31:00.798 "dma_device_id": "system", 00:31:00.798 "dma_device_type": 1 00:31:00.798 }, 00:31:00.798 { 00:31:00.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:00.798 "dma_device_type": 2 00:31:00.798 } 00:31:00.798 ], 00:31:00.798 "driver_specific": {} 00:31:00.798 }' 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:00.798 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:01.057 11:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:01.316 [2024-06-10 11:41:45.127526] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:01.316 [2024-06-10 11:41:45.127547] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:01.316 [2024-06-10 11:41:45.127586] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:01.316 [2024-06-10 11:41:45.127625] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:01.316 [2024-06-10 11:41:45.127633] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x238e410 name Existed_Raid, state offline 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 157599 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 157599 ']' 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 157599 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 157599 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- 
# process_name=reactor_0 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 157599' 00:31:01.317 killing process with pid 157599 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 157599 00:31:01.317 [2024-06-10 11:41:45.174406] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:01.317 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 157599 00:31:01.317 [2024-06-10 11:41:45.199720] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:31:01.576 00:31:01.576 real 0m21.794s 00:31:01.576 user 0m39.723s 00:31:01.576 sys 0m4.238s 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:01.576 ************************************ 00:31:01.576 END TEST raid_state_function_test 00:31:01.576 ************************************ 00:31:01.576 11:41:45 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:31:01.576 11:41:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:31:01.576 11:41:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:01.576 11:41:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:01.576 ************************************ 00:31:01.576 START TEST raid_state_function_test_sb 00:31:01.576 ************************************ 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 true 00:31:01.576 11:41:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:31:01.576 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:31:01.577 11:41:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=161031 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 161031' 00:31:01.577 Process raid pid: 161031 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 161031 /var/tmp/spdk-raid.sock 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 161031 ']' 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to 
start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:01.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:01.577 11:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:01.836 [2024-06-10 11:41:45.523828] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:31:01.836 [2024-06-10 11:41:45.523887] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:01.836 [2024-06-10 11:41:45.612258] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:01.836 [2024-06-10 11:41:45.700576] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:01.836 [2024-06-10 11:41:45.761931] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:01.837 [2024-06-10 11:41:45.761958] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:02.405 11:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:02.405 11:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:31:02.405 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:02.664 [2024-06-10 11:41:46.477644] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:02.664 [2024-06-10 11:41:46.477681] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:02.664 [2024-06-10 11:41:46.477688] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:02.664 [2024-06-10 11:41:46.477700] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:02.664 [2024-06-10 11:41:46.477706] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:02.664 [2024-06-10 11:41:46.477713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.664 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:31:02.923 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:02.923 "name": "Existed_Raid", 00:31:02.923 "uuid": "7a9b7ccb-58f3-4792-bd53-b2b394f82574", 00:31:02.923 "strip_size_kb": 64, 00:31:02.923 "state": "configuring", 00:31:02.923 "raid_level": "concat", 00:31:02.923 "superblock": true, 00:31:02.923 "num_base_bdevs": 3, 00:31:02.923 "num_base_bdevs_discovered": 0, 00:31:02.923 "num_base_bdevs_operational": 3, 00:31:02.923 "base_bdevs_list": [ 00:31:02.923 { 00:31:02.923 "name": "BaseBdev1", 00:31:02.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:02.923 "is_configured": false, 00:31:02.923 "data_offset": 0, 00:31:02.923 "data_size": 0 00:31:02.923 }, 00:31:02.923 { 00:31:02.923 "name": "BaseBdev2", 00:31:02.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:02.923 "is_configured": false, 00:31:02.923 "data_offset": 0, 00:31:02.923 "data_size": 0 00:31:02.923 }, 00:31:02.923 { 00:31:02.923 "name": "BaseBdev3", 00:31:02.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:02.923 "is_configured": false, 00:31:02.923 "data_offset": 0, 00:31:02.923 "data_size": 0 00:31:02.923 } 00:31:02.923 ] 00:31:02.923 }' 00:31:02.923 11:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:02.923 11:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:03.490 11:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:03.490 [2024-06-10 11:41:47.287636] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:03.490 [2024-06-10 11:41:47.287662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180c530 name Existed_Raid, state configuring 00:31:03.490 11:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:03.749 [2024-06-10 11:41:47.464122] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:03.749 [2024-06-10 11:41:47.464147] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:03.749 [2024-06-10 11:41:47.464153] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:03.749 [2024-06-10 11:41:47.464161] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:03.749 [2024-06-10 11:41:47.464183] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:03.749 [2024-06-10 11:41:47.464194] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:31:03.749 [2024-06-10 11:41:47.645128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:03.749 BaseBdev1 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # 
bdev_timeout=2000 00:31:03.749 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:04.008 11:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:04.267 [ 00:31:04.267 { 00:31:04.267 "name": "BaseBdev1", 00:31:04.267 "aliases": [ 00:31:04.267 "386cc45d-9715-41f3-ba5a-e4de9d47b3cd" 00:31:04.267 ], 00:31:04.267 "product_name": "Malloc disk", 00:31:04.267 "block_size": 512, 00:31:04.267 "num_blocks": 65536, 00:31:04.267 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:04.267 "assigned_rate_limits": { 00:31:04.267 "rw_ios_per_sec": 0, 00:31:04.267 "rw_mbytes_per_sec": 0, 00:31:04.267 "r_mbytes_per_sec": 0, 00:31:04.267 "w_mbytes_per_sec": 0 00:31:04.267 }, 00:31:04.267 "claimed": true, 00:31:04.267 "claim_type": "exclusive_write", 00:31:04.267 "zoned": false, 00:31:04.267 "supported_io_types": { 00:31:04.267 "read": true, 00:31:04.267 "write": true, 00:31:04.267 "unmap": true, 00:31:04.267 "write_zeroes": true, 00:31:04.267 "flush": true, 00:31:04.267 "reset": true, 00:31:04.267 "compare": false, 00:31:04.267 "compare_and_write": false, 00:31:04.267 "abort": true, 00:31:04.267 "nvme_admin": false, 00:31:04.267 "nvme_io": false 00:31:04.267 }, 00:31:04.267 "memory_domains": [ 00:31:04.267 { 00:31:04.267 "dma_device_id": "system", 00:31:04.267 "dma_device_type": 1 00:31:04.267 }, 00:31:04.267 { 00:31:04.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:04.267 "dma_device_type": 2 00:31:04.267 } 00:31:04.267 ], 00:31:04.267 "driver_specific": {} 00:31:04.267 } 00:31:04.267 ] 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:04.267 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:04.268 "name": "Existed_Raid", 00:31:04.268 "uuid": "8fc5a82f-2747-42bc-a072-c9a770b22c2a", 00:31:04.268 "strip_size_kb": 64, 00:31:04.268 "state": "configuring", 00:31:04.268 "raid_level": "concat", 00:31:04.268 "superblock": true, 00:31:04.268 "num_base_bdevs": 3, 00:31:04.268 "num_base_bdevs_discovered": 1, 00:31:04.268 "num_base_bdevs_operational": 3, 00:31:04.268 "base_bdevs_list": [ 00:31:04.268 { 
00:31:04.268 "name": "BaseBdev1", 00:31:04.268 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:04.268 "is_configured": true, 00:31:04.268 "data_offset": 2048, 00:31:04.268 "data_size": 63488 00:31:04.268 }, 00:31:04.268 { 00:31:04.268 "name": "BaseBdev2", 00:31:04.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:04.268 "is_configured": false, 00:31:04.268 "data_offset": 0, 00:31:04.268 "data_size": 0 00:31:04.268 }, 00:31:04.268 { 00:31:04.268 "name": "BaseBdev3", 00:31:04.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:04.268 "is_configured": false, 00:31:04.268 "data_offset": 0, 00:31:04.268 "data_size": 0 00:31:04.268 } 00:31:04.268 ] 00:31:04.268 }' 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:04.268 11:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:04.835 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:05.094 [2024-06-10 11:41:48.824171] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:05.094 [2024-06-10 11:41:48.824200] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180be00 name Existed_Raid, state configuring 00:31:05.094 11:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:05.094 [2024-06-10 11:41:48.996642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:05.094 [2024-06-10 11:41:48.997674] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:05.094 [2024-06-10 11:41:48.997700] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:31:05.094 [2024-06-10 11:41:48.997706] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:05.094 [2024-06-10 11:41:48.997714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.094 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:31:05.353 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:05.353 "name": "Existed_Raid", 00:31:05.353 "uuid": "7f3ca7a4-517e-4025-a5d2-eefd949e0980", 00:31:05.353 "strip_size_kb": 64, 00:31:05.353 "state": "configuring", 00:31:05.353 "raid_level": "concat", 00:31:05.353 "superblock": true, 00:31:05.353 "num_base_bdevs": 3, 00:31:05.353 "num_base_bdevs_discovered": 1, 00:31:05.353 "num_base_bdevs_operational": 3, 00:31:05.353 "base_bdevs_list": [ 00:31:05.353 { 00:31:05.353 "name": "BaseBdev1", 00:31:05.353 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:05.353 "is_configured": true, 00:31:05.353 "data_offset": 2048, 00:31:05.353 "data_size": 63488 00:31:05.353 }, 00:31:05.353 { 00:31:05.353 "name": "BaseBdev2", 00:31:05.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:05.353 "is_configured": false, 00:31:05.353 "data_offset": 0, 00:31:05.353 "data_size": 0 00:31:05.353 }, 00:31:05.353 { 00:31:05.353 "name": "BaseBdev3", 00:31:05.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:05.353 "is_configured": false, 00:31:05.353 "data_offset": 0, 00:31:05.353 "data_size": 0 00:31:05.353 } 00:31:05.353 ] 00:31:05.353 }' 00:31:05.353 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:05.353 11:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:31:05.920 [2024-06-10 11:41:49.845624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:05.920 BaseBdev2 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:05.920 11:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:06.179 11:41:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:06.438 [ 00:31:06.438 { 00:31:06.438 "name": "BaseBdev2", 00:31:06.438 "aliases": [ 00:31:06.438 "83334692-38f5-42d7-b986-046da803fef2" 00:31:06.438 ], 00:31:06.438 "product_name": "Malloc disk", 00:31:06.438 "block_size": 512, 00:31:06.438 "num_blocks": 65536, 00:31:06.438 "uuid": "83334692-38f5-42d7-b986-046da803fef2", 00:31:06.438 "assigned_rate_limits": { 00:31:06.438 "rw_ios_per_sec": 0, 00:31:06.438 "rw_mbytes_per_sec": 0, 00:31:06.438 "r_mbytes_per_sec": 0, 00:31:06.438 "w_mbytes_per_sec": 0 00:31:06.438 }, 00:31:06.438 "claimed": true, 00:31:06.438 "claim_type": "exclusive_write", 00:31:06.438 "zoned": false, 00:31:06.438 "supported_io_types": { 00:31:06.438 "read": true, 00:31:06.438 "write": true, 00:31:06.438 "unmap": true, 00:31:06.438 "write_zeroes": true, 00:31:06.438 "flush": true, 00:31:06.438 "reset": true, 00:31:06.438 "compare": false, 00:31:06.438 "compare_and_write": false, 00:31:06.438 "abort": true, 00:31:06.438 "nvme_admin": false, 00:31:06.438 "nvme_io": false 00:31:06.438 }, 00:31:06.438 
"memory_domains": [ 00:31:06.438 { 00:31:06.438 "dma_device_id": "system", 00:31:06.438 "dma_device_type": 1 00:31:06.438 }, 00:31:06.438 { 00:31:06.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:06.438 "dma_device_type": 2 00:31:06.438 } 00:31:06.438 ], 00:31:06.438 "driver_specific": {} 00:31:06.438 } 00:31:06.438 ] 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:06.438 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:06.697 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:06.697 "name": "Existed_Raid", 00:31:06.697 "uuid": "7f3ca7a4-517e-4025-a5d2-eefd949e0980", 00:31:06.697 "strip_size_kb": 64, 00:31:06.697 "state": "configuring", 00:31:06.697 "raid_level": "concat", 00:31:06.697 "superblock": true, 00:31:06.697 "num_base_bdevs": 3, 00:31:06.697 "num_base_bdevs_discovered": 2, 00:31:06.697 "num_base_bdevs_operational": 3, 00:31:06.697 "base_bdevs_list": [ 00:31:06.697 { 00:31:06.697 "name": "BaseBdev1", 00:31:06.697 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:06.697 "is_configured": true, 00:31:06.697 "data_offset": 2048, 00:31:06.697 "data_size": 63488 00:31:06.697 }, 00:31:06.697 { 00:31:06.697 "name": "BaseBdev2", 00:31:06.697 "uuid": "83334692-38f5-42d7-b986-046da803fef2", 00:31:06.697 "is_configured": true, 00:31:06.697 "data_offset": 2048, 00:31:06.697 "data_size": 63488 00:31:06.697 }, 00:31:06.697 { 00:31:06.697 "name": "BaseBdev3", 00:31:06.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:06.697 "is_configured": false, 00:31:06.697 "data_offset": 0, 00:31:06.697 "data_size": 0 00:31:06.697 } 00:31:06.697 ] 00:31:06.697 }' 00:31:06.697 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:06.697 11:41:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:06.956 11:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:31:07.214 [2024-06-10 11:41:51.035590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:07.214 [2024-06-10 11:41:51.035712] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x180ccf0 00:31:07.214 [2024-06-10 11:41:51.035722] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:31:07.214 [2024-06-10 11:41:51.035838] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1823be0 00:31:07.214 [2024-06-10 11:41:51.035931] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x180ccf0 00:31:07.214 [2024-06-10 11:41:51.035938] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x180ccf0 00:31:07.214 [2024-06-10 11:41:51.036002] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:07.214 BaseBdev3 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:07.214 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:31:07.473 [ 00:31:07.473 { 00:31:07.473 "name": "BaseBdev3", 00:31:07.473 "aliases": [ 00:31:07.473 
"c16a767b-acf8-4407-8bd4-d5988c5dd7c5" 00:31:07.473 ], 00:31:07.473 "product_name": "Malloc disk", 00:31:07.473 "block_size": 512, 00:31:07.473 "num_blocks": 65536, 00:31:07.473 "uuid": "c16a767b-acf8-4407-8bd4-d5988c5dd7c5", 00:31:07.473 "assigned_rate_limits": { 00:31:07.473 "rw_ios_per_sec": 0, 00:31:07.473 "rw_mbytes_per_sec": 0, 00:31:07.473 "r_mbytes_per_sec": 0, 00:31:07.473 "w_mbytes_per_sec": 0 00:31:07.473 }, 00:31:07.473 "claimed": true, 00:31:07.473 "claim_type": "exclusive_write", 00:31:07.473 "zoned": false, 00:31:07.473 "supported_io_types": { 00:31:07.473 "read": true, 00:31:07.473 "write": true, 00:31:07.473 "unmap": true, 00:31:07.473 "write_zeroes": true, 00:31:07.473 "flush": true, 00:31:07.473 "reset": true, 00:31:07.473 "compare": false, 00:31:07.473 "compare_and_write": false, 00:31:07.473 "abort": true, 00:31:07.473 "nvme_admin": false, 00:31:07.473 "nvme_io": false 00:31:07.473 }, 00:31:07.473 "memory_domains": [ 00:31:07.473 { 00:31:07.473 "dma_device_id": "system", 00:31:07.473 "dma_device_type": 1 00:31:07.473 }, 00:31:07.473 { 00:31:07.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:07.473 "dma_device_type": 2 00:31:07.473 } 00:31:07.473 ], 00:31:07.473 "driver_specific": {} 00:31:07.473 } 00:31:07.473 ] 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:07.473 11:41:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:07.473 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:07.474 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:07.474 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:07.474 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:07.732 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:07.732 "name": "Existed_Raid", 00:31:07.732 "uuid": "7f3ca7a4-517e-4025-a5d2-eefd949e0980", 00:31:07.732 "strip_size_kb": 64, 00:31:07.732 "state": "online", 00:31:07.732 "raid_level": "concat", 00:31:07.732 "superblock": true, 00:31:07.733 "num_base_bdevs": 3, 00:31:07.733 "num_base_bdevs_discovered": 3, 00:31:07.733 "num_base_bdevs_operational": 3, 00:31:07.733 "base_bdevs_list": [ 00:31:07.733 { 00:31:07.733 "name": "BaseBdev1", 00:31:07.733 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:07.733 "is_configured": true, 00:31:07.733 "data_offset": 2048, 00:31:07.733 "data_size": 63488 00:31:07.733 }, 00:31:07.733 { 00:31:07.733 "name": "BaseBdev2", 00:31:07.733 "uuid": "83334692-38f5-42d7-b986-046da803fef2", 00:31:07.733 "is_configured": true, 00:31:07.733 
"data_offset": 2048, 00:31:07.733 "data_size": 63488 00:31:07.733 }, 00:31:07.733 { 00:31:07.733 "name": "BaseBdev3", 00:31:07.733 "uuid": "c16a767b-acf8-4407-8bd4-d5988c5dd7c5", 00:31:07.733 "is_configured": true, 00:31:07.733 "data_offset": 2048, 00:31:07.733 "data_size": 63488 00:31:07.733 } 00:31:07.733 ] 00:31:07.733 }' 00:31:07.733 11:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:07.733 11:41:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:08.300 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:08.300 [2024-06-10 11:41:52.226860] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:08.560 "name": "Existed_Raid", 00:31:08.560 "aliases": [ 00:31:08.560 "7f3ca7a4-517e-4025-a5d2-eefd949e0980" 00:31:08.560 ], 00:31:08.560 "product_name": "Raid Volume", 00:31:08.560 "block_size": 512, 00:31:08.560 "num_blocks": 
190464, 00:31:08.560 "uuid": "7f3ca7a4-517e-4025-a5d2-eefd949e0980", 00:31:08.560 "assigned_rate_limits": { 00:31:08.560 "rw_ios_per_sec": 0, 00:31:08.560 "rw_mbytes_per_sec": 0, 00:31:08.560 "r_mbytes_per_sec": 0, 00:31:08.560 "w_mbytes_per_sec": 0 00:31:08.560 }, 00:31:08.560 "claimed": false, 00:31:08.560 "zoned": false, 00:31:08.560 "supported_io_types": { 00:31:08.560 "read": true, 00:31:08.560 "write": true, 00:31:08.560 "unmap": true, 00:31:08.560 "write_zeroes": true, 00:31:08.560 "flush": true, 00:31:08.560 "reset": true, 00:31:08.560 "compare": false, 00:31:08.560 "compare_and_write": false, 00:31:08.560 "abort": false, 00:31:08.560 "nvme_admin": false, 00:31:08.560 "nvme_io": false 00:31:08.560 }, 00:31:08.560 "memory_domains": [ 00:31:08.560 { 00:31:08.560 "dma_device_id": "system", 00:31:08.560 "dma_device_type": 1 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:08.560 "dma_device_type": 2 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "dma_device_id": "system", 00:31:08.560 "dma_device_type": 1 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:08.560 "dma_device_type": 2 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "dma_device_id": "system", 00:31:08.560 "dma_device_type": 1 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:08.560 "dma_device_type": 2 00:31:08.560 } 00:31:08.560 ], 00:31:08.560 "driver_specific": { 00:31:08.560 "raid": { 00:31:08.560 "uuid": "7f3ca7a4-517e-4025-a5d2-eefd949e0980", 00:31:08.560 "strip_size_kb": 64, 00:31:08.560 "state": "online", 00:31:08.560 "raid_level": "concat", 00:31:08.560 "superblock": true, 00:31:08.560 "num_base_bdevs": 3, 00:31:08.560 "num_base_bdevs_discovered": 3, 00:31:08.560 "num_base_bdevs_operational": 3, 00:31:08.560 "base_bdevs_list": [ 00:31:08.560 { 00:31:08.560 "name": "BaseBdev1", 00:31:08.560 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:08.560 "is_configured": true, 
00:31:08.560 "data_offset": 2048, 00:31:08.560 "data_size": 63488 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "name": "BaseBdev2", 00:31:08.560 "uuid": "83334692-38f5-42d7-b986-046da803fef2", 00:31:08.560 "is_configured": true, 00:31:08.560 "data_offset": 2048, 00:31:08.560 "data_size": 63488 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "name": "BaseBdev3", 00:31:08.560 "uuid": "c16a767b-acf8-4407-8bd4-d5988c5dd7c5", 00:31:08.560 "is_configured": true, 00:31:08.560 "data_offset": 2048, 00:31:08.560 "data_size": 63488 00:31:08.560 } 00:31:08.560 ] 00:31:08.560 } 00:31:08.560 } 00:31:08.560 }' 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:31:08.560 BaseBdev2 00:31:08.560 BaseBdev3' 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:08.560 "name": "BaseBdev1", 00:31:08.560 "aliases": [ 00:31:08.560 "386cc45d-9715-41f3-ba5a-e4de9d47b3cd" 00:31:08.560 ], 00:31:08.560 "product_name": "Malloc disk", 00:31:08.560 "block_size": 512, 00:31:08.560 "num_blocks": 65536, 00:31:08.560 "uuid": "386cc45d-9715-41f3-ba5a-e4de9d47b3cd", 00:31:08.560 "assigned_rate_limits": { 00:31:08.560 "rw_ios_per_sec": 0, 00:31:08.560 "rw_mbytes_per_sec": 0, 00:31:08.560 "r_mbytes_per_sec": 0, 00:31:08.560 "w_mbytes_per_sec": 0 00:31:08.560 }, 00:31:08.560 "claimed": true, 
00:31:08.560 "claim_type": "exclusive_write", 00:31:08.560 "zoned": false, 00:31:08.560 "supported_io_types": { 00:31:08.560 "read": true, 00:31:08.560 "write": true, 00:31:08.560 "unmap": true, 00:31:08.560 "write_zeroes": true, 00:31:08.560 "flush": true, 00:31:08.560 "reset": true, 00:31:08.560 "compare": false, 00:31:08.560 "compare_and_write": false, 00:31:08.560 "abort": true, 00:31:08.560 "nvme_admin": false, 00:31:08.560 "nvme_io": false 00:31:08.560 }, 00:31:08.560 "memory_domains": [ 00:31:08.560 { 00:31:08.560 "dma_device_id": "system", 00:31:08.560 "dma_device_type": 1 00:31:08.560 }, 00:31:08.560 { 00:31:08.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:08.560 "dma_device_type": 2 00:31:08.560 } 00:31:08.560 ], 00:31:08.560 "driver_specific": {} 00:31:08.560 }' 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:08.560 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:08.819 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.078 11:41:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:09.078 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:09.078 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:09.078 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:09.078 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:09.078 "name": "BaseBdev2", 00:31:09.078 "aliases": [ 00:31:09.078 "83334692-38f5-42d7-b986-046da803fef2" 00:31:09.078 ], 00:31:09.078 "product_name": "Malloc disk", 00:31:09.078 "block_size": 512, 00:31:09.078 "num_blocks": 65536, 00:31:09.078 "uuid": "83334692-38f5-42d7-b986-046da803fef2", 00:31:09.078 "assigned_rate_limits": { 00:31:09.078 "rw_ios_per_sec": 0, 00:31:09.078 "rw_mbytes_per_sec": 0, 00:31:09.078 "r_mbytes_per_sec": 0, 00:31:09.078 "w_mbytes_per_sec": 0 00:31:09.078 }, 00:31:09.078 "claimed": true, 00:31:09.078 "claim_type": "exclusive_write", 00:31:09.078 "zoned": false, 00:31:09.078 "supported_io_types": { 00:31:09.078 "read": true, 00:31:09.078 "write": true, 00:31:09.078 "unmap": true, 00:31:09.078 "write_zeroes": true, 00:31:09.078 "flush": true, 00:31:09.078 "reset": true, 00:31:09.078 "compare": false, 00:31:09.078 "compare_and_write": false, 00:31:09.078 "abort": true, 00:31:09.078 "nvme_admin": false, 00:31:09.078 "nvme_io": false 00:31:09.078 }, 00:31:09.078 "memory_domains": [ 00:31:09.078 { 00:31:09.078 "dma_device_id": "system", 00:31:09.078 "dma_device_type": 1 00:31:09.078 }, 00:31:09.078 { 00:31:09.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:09.079 "dma_device_type": 2 00:31:09.079 } 00:31:09.079 ], 00:31:09.079 "driver_specific": {} 00:31:09.079 }' 00:31:09.079 11:41:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:09.079 11:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:09.337 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:09.338 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:09.596 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:09.596 "name": "BaseBdev3", 00:31:09.596 "aliases": [ 00:31:09.596 "c16a767b-acf8-4407-8bd4-d5988c5dd7c5" 00:31:09.596 ], 00:31:09.596 "product_name": "Malloc disk", 00:31:09.596 "block_size": 512, 00:31:09.596 "num_blocks": 65536, 00:31:09.596 "uuid": 
"c16a767b-acf8-4407-8bd4-d5988c5dd7c5", 00:31:09.596 "assigned_rate_limits": { 00:31:09.596 "rw_ios_per_sec": 0, 00:31:09.596 "rw_mbytes_per_sec": 0, 00:31:09.596 "r_mbytes_per_sec": 0, 00:31:09.596 "w_mbytes_per_sec": 0 00:31:09.596 }, 00:31:09.596 "claimed": true, 00:31:09.596 "claim_type": "exclusive_write", 00:31:09.596 "zoned": false, 00:31:09.596 "supported_io_types": { 00:31:09.596 "read": true, 00:31:09.596 "write": true, 00:31:09.596 "unmap": true, 00:31:09.596 "write_zeroes": true, 00:31:09.596 "flush": true, 00:31:09.596 "reset": true, 00:31:09.596 "compare": false, 00:31:09.596 "compare_and_write": false, 00:31:09.596 "abort": true, 00:31:09.596 "nvme_admin": false, 00:31:09.597 "nvme_io": false 00:31:09.597 }, 00:31:09.597 "memory_domains": [ 00:31:09.597 { 00:31:09.597 "dma_device_id": "system", 00:31:09.597 "dma_device_type": 1 00:31:09.597 }, 00:31:09.597 { 00:31:09.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:09.597 "dma_device_type": 2 00:31:09.597 } 00:31:09.597 ], 00:31:09.597 "driver_specific": {} 00:31:09.597 }' 00:31:09.597 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:09.597 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:09.597 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:09.597 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ 
null == null ]] 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:09.855 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:10.113 [2024-06-10 11:41:53.883048] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:10.113 [2024-06-10 11:41:53.883070] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:10.113 [2024-06-10 11:41:53.883101] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:10.113 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:31:10.113 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:10.114 11:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:10.372 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:10.372 "name": "Existed_Raid", 00:31:10.372 "uuid": "7f3ca7a4-517e-4025-a5d2-eefd949e0980", 00:31:10.372 "strip_size_kb": 64, 00:31:10.372 "state": "offline", 00:31:10.372 "raid_level": "concat", 00:31:10.372 "superblock": true, 00:31:10.372 "num_base_bdevs": 3, 00:31:10.372 "num_base_bdevs_discovered": 2, 00:31:10.372 "num_base_bdevs_operational": 2, 00:31:10.372 "base_bdevs_list": [ 00:31:10.372 { 00:31:10.372 "name": null, 00:31:10.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:10.372 "is_configured": false, 00:31:10.372 "data_offset": 2048, 00:31:10.372 "data_size": 63488 00:31:10.372 }, 00:31:10.372 { 00:31:10.372 "name": "BaseBdev2", 00:31:10.372 "uuid": "83334692-38f5-42d7-b986-046da803fef2", 00:31:10.372 "is_configured": true, 00:31:10.372 "data_offset": 2048, 00:31:10.372 "data_size": 63488 00:31:10.372 }, 00:31:10.372 { 00:31:10.372 "name": "BaseBdev3", 00:31:10.372 "uuid": 
"c16a767b-acf8-4407-8bd4-d5988c5dd7c5", 00:31:10.372 "is_configured": true, 00:31:10.372 "data_offset": 2048, 00:31:10.372 "data_size": 63488 00:31:10.372 } 00:31:10.372 ] 00:31:10.372 }' 00:31:10.372 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:10.372 11:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:10.630 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:31:10.630 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:10.630 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:10.630 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:10.888 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:10.888 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:10.888 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:31:11.146 [2024-06-10 11:41:54.882462] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:11.146 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:11.146 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:11.146 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.146 11:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq 
-r '.[0]["name"]' 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:31:11.404 [2024-06-10 11:41:55.253512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:31:11.404 [2024-06-10 11:41:55.253546] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180ccf0 name Existed_Raid, state offline 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.404 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:31:11.663 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:31:11.663 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:31:11.663 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:31:11.663 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:31:11.663 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev2 00:31:11.664 BaseBdev2 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:11.664 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:11.922 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:12.181 [ 00:31:12.181 { 00:31:12.181 "name": "BaseBdev2", 00:31:12.181 "aliases": [ 00:31:12.181 "a68f9352-2674-4ebb-944e-ebde930b3836" 00:31:12.181 ], 00:31:12.181 "product_name": "Malloc disk", 00:31:12.181 "block_size": 512, 00:31:12.181 "num_blocks": 65536, 00:31:12.181 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:12.181 "assigned_rate_limits": { 00:31:12.181 "rw_ios_per_sec": 0, 00:31:12.181 "rw_mbytes_per_sec": 0, 00:31:12.181 "r_mbytes_per_sec": 0, 00:31:12.181 "w_mbytes_per_sec": 0 00:31:12.181 }, 00:31:12.181 "claimed": false, 00:31:12.181 "zoned": false, 00:31:12.181 "supported_io_types": { 00:31:12.181 "read": true, 00:31:12.181 "write": true, 00:31:12.181 "unmap": true, 00:31:12.181 "write_zeroes": true, 00:31:12.181 "flush": true, 00:31:12.181 "reset": true, 00:31:12.181 "compare": false, 
00:31:12.181 "compare_and_write": false, 00:31:12.181 "abort": true, 00:31:12.181 "nvme_admin": false, 00:31:12.181 "nvme_io": false 00:31:12.181 }, 00:31:12.181 "memory_domains": [ 00:31:12.181 { 00:31:12.181 "dma_device_id": "system", 00:31:12.181 "dma_device_type": 1 00:31:12.181 }, 00:31:12.181 { 00:31:12.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:12.181 "dma_device_type": 2 00:31:12.181 } 00:31:12.181 ], 00:31:12.181 "driver_specific": {} 00:31:12.181 } 00:31:12.181 ] 00:31:12.181 11:41:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:12.181 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:12.181 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:12.181 11:41:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:31:12.439 BaseBdev3 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:12.439 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:12.439 11:41:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:31:12.697 [ 00:31:12.697 { 00:31:12.697 "name": "BaseBdev3", 00:31:12.697 "aliases": [ 00:31:12.697 "137bef49-f131-4964-9048-ac8f3f1fbb89" 00:31:12.697 ], 00:31:12.697 "product_name": "Malloc disk", 00:31:12.697 "block_size": 512, 00:31:12.697 "num_blocks": 65536, 00:31:12.697 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:12.697 "assigned_rate_limits": { 00:31:12.697 "rw_ios_per_sec": 0, 00:31:12.697 "rw_mbytes_per_sec": 0, 00:31:12.697 "r_mbytes_per_sec": 0, 00:31:12.697 "w_mbytes_per_sec": 0 00:31:12.697 }, 00:31:12.697 "claimed": false, 00:31:12.697 "zoned": false, 00:31:12.697 "supported_io_types": { 00:31:12.697 "read": true, 00:31:12.697 "write": true, 00:31:12.697 "unmap": true, 00:31:12.697 "write_zeroes": true, 00:31:12.697 "flush": true, 00:31:12.698 "reset": true, 00:31:12.698 "compare": false, 00:31:12.698 "compare_and_write": false, 00:31:12.698 "abort": true, 00:31:12.698 "nvme_admin": false, 00:31:12.698 "nvme_io": false 00:31:12.698 }, 00:31:12.698 "memory_domains": [ 00:31:12.698 { 00:31:12.698 "dma_device_id": "system", 00:31:12.698 "dma_device_type": 1 00:31:12.698 }, 00:31:12.698 { 00:31:12.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:12.698 "dma_device_type": 2 00:31:12.698 } 00:31:12.698 ], 00:31:12.698 "driver_specific": {} 00:31:12.698 } 00:31:12.698 ] 00:31:12.698 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:12.698 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:12.698 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:12.698 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:12.956 [2024-06-10 11:41:56.656341] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:12.956 [2024-06-10 11:41:56.656372] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:12.956 [2024-06-10 11:41:56.656385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:12.956 [2024-06-10 11:41:56.657365] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:12.956 11:41:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:12.956 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:12.956 "name": "Existed_Raid", 00:31:12.956 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:12.956 "strip_size_kb": 64, 00:31:12.956 "state": "configuring", 00:31:12.956 "raid_level": "concat", 00:31:12.956 "superblock": true, 00:31:12.956 "num_base_bdevs": 3, 00:31:12.956 "num_base_bdevs_discovered": 2, 00:31:12.956 "num_base_bdevs_operational": 3, 00:31:12.956 "base_bdevs_list": [ 00:31:12.956 { 00:31:12.956 "name": "BaseBdev1", 00:31:12.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:12.956 "is_configured": false, 00:31:12.956 "data_offset": 0, 00:31:12.956 "data_size": 0 00:31:12.956 }, 00:31:12.956 { 00:31:12.956 "name": "BaseBdev2", 00:31:12.956 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:12.956 "is_configured": true, 00:31:12.956 "data_offset": 2048, 00:31:12.956 "data_size": 63488 00:31:12.956 }, 00:31:12.956 { 00:31:12.956 "name": "BaseBdev3", 00:31:12.956 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:12.957 "is_configured": true, 00:31:12.957 "data_offset": 2048, 00:31:12.957 "data_size": 63488 00:31:12.957 } 00:31:12.957 ] 00:31:12.957 }' 00:31:12.957 11:41:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:12.957 11:41:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:13.523 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:31:13.782 [2024-06-10 11:41:57.486459] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:13.782 "name": "Existed_Raid", 00:31:13.782 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:13.782 "strip_size_kb": 64, 00:31:13.782 "state": "configuring", 00:31:13.782 "raid_level": "concat", 00:31:13.782 "superblock": true, 00:31:13.782 "num_base_bdevs": 3, 00:31:13.782 "num_base_bdevs_discovered": 1, 00:31:13.782 "num_base_bdevs_operational": 3, 00:31:13.782 "base_bdevs_list": [ 00:31:13.782 { 
00:31:13.782 "name": "BaseBdev1", 00:31:13.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:13.782 "is_configured": false, 00:31:13.782 "data_offset": 0, 00:31:13.782 "data_size": 0 00:31:13.782 }, 00:31:13.782 { 00:31:13.782 "name": null, 00:31:13.782 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:13.782 "is_configured": false, 00:31:13.782 "data_offset": 2048, 00:31:13.782 "data_size": 63488 00:31:13.782 }, 00:31:13.782 { 00:31:13.782 "name": "BaseBdev3", 00:31:13.782 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:13.782 "is_configured": true, 00:31:13.782 "data_offset": 2048, 00:31:13.782 "data_size": 63488 00:31:13.782 } 00:31:13.782 ] 00:31:13.782 }' 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:13.782 11:41:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:14.349 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:31:14.349 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:14.607 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:31:14.607 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:31:14.607 [2024-06-10 11:41:58.480764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:14.607 BaseBdev1 00:31:14.608 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:31:14.608 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:31:14.608 11:41:58 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:14.608 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:14.608 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:14.608 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:14.608 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:14.866 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:15.125 [ 00:31:15.125 { 00:31:15.125 "name": "BaseBdev1", 00:31:15.125 "aliases": [ 00:31:15.125 "392eefeb-750c-4579-8c8b-f37c83e5a906" 00:31:15.125 ], 00:31:15.125 "product_name": "Malloc disk", 00:31:15.125 "block_size": 512, 00:31:15.125 "num_blocks": 65536, 00:31:15.125 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:15.125 "assigned_rate_limits": { 00:31:15.125 "rw_ios_per_sec": 0, 00:31:15.125 "rw_mbytes_per_sec": 0, 00:31:15.125 "r_mbytes_per_sec": 0, 00:31:15.125 "w_mbytes_per_sec": 0 00:31:15.125 }, 00:31:15.125 "claimed": true, 00:31:15.125 "claim_type": "exclusive_write", 00:31:15.125 "zoned": false, 00:31:15.125 "supported_io_types": { 00:31:15.125 "read": true, 00:31:15.125 "write": true, 00:31:15.125 "unmap": true, 00:31:15.125 "write_zeroes": true, 00:31:15.125 "flush": true, 00:31:15.125 "reset": true, 00:31:15.125 "compare": false, 00:31:15.125 "compare_and_write": false, 00:31:15.125 "abort": true, 00:31:15.125 "nvme_admin": false, 00:31:15.125 "nvme_io": false 00:31:15.125 }, 00:31:15.125 "memory_domains": [ 00:31:15.125 { 00:31:15.125 "dma_device_id": "system", 00:31:15.125 
"dma_device_type": 1 00:31:15.125 }, 00:31:15.125 { 00:31:15.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:15.125 "dma_device_type": 2 00:31:15.125 } 00:31:15.125 ], 00:31:15.125 "driver_specific": {} 00:31:15.125 } 00:31:15.125 ] 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:15.125 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:15.125 
"name": "Existed_Raid", 00:31:15.125 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:15.126 "strip_size_kb": 64, 00:31:15.126 "state": "configuring", 00:31:15.126 "raid_level": "concat", 00:31:15.126 "superblock": true, 00:31:15.126 "num_base_bdevs": 3, 00:31:15.126 "num_base_bdevs_discovered": 2, 00:31:15.126 "num_base_bdevs_operational": 3, 00:31:15.126 "base_bdevs_list": [ 00:31:15.126 { 00:31:15.126 "name": "BaseBdev1", 00:31:15.126 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:15.126 "is_configured": true, 00:31:15.126 "data_offset": 2048, 00:31:15.126 "data_size": 63488 00:31:15.126 }, 00:31:15.126 { 00:31:15.126 "name": null, 00:31:15.126 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:15.126 "is_configured": false, 00:31:15.126 "data_offset": 2048, 00:31:15.126 "data_size": 63488 00:31:15.126 }, 00:31:15.126 { 00:31:15.126 "name": "BaseBdev3", 00:31:15.126 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:15.126 "is_configured": true, 00:31:15.126 "data_offset": 2048, 00:31:15.126 "data_size": 63488 00:31:15.126 } 00:31:15.126 ] 00:31:15.126 }' 00:31:15.126 11:41:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:15.126 11:41:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:15.692 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.692 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:31:15.951 [2024-06-10 
11:41:59.820237] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.951 11:41:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:16.209 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:16.209 "name": "Existed_Raid", 00:31:16.209 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:16.209 "strip_size_kb": 64, 00:31:16.209 "state": "configuring", 00:31:16.209 "raid_level": "concat", 00:31:16.209 "superblock": true, 00:31:16.209 "num_base_bdevs": 3, 
00:31:16.209 "num_base_bdevs_discovered": 1, 00:31:16.209 "num_base_bdevs_operational": 3, 00:31:16.209 "base_bdevs_list": [ 00:31:16.209 { 00:31:16.209 "name": "BaseBdev1", 00:31:16.209 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:16.209 "is_configured": true, 00:31:16.209 "data_offset": 2048, 00:31:16.209 "data_size": 63488 00:31:16.209 }, 00:31:16.209 { 00:31:16.209 "name": null, 00:31:16.209 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:16.209 "is_configured": false, 00:31:16.209 "data_offset": 2048, 00:31:16.209 "data_size": 63488 00:31:16.209 }, 00:31:16.209 { 00:31:16.209 "name": null, 00:31:16.209 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:16.209 "is_configured": false, 00:31:16.209 "data_offset": 2048, 00:31:16.209 "data_size": 63488 00:31:16.209 } 00:31:16.209 ] 00:31:16.209 }' 00:31:16.209 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:16.209 11:42:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:16.773 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:16.773 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:31:17.031 [2024-06-10 11:42:00.879019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 
00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.031 11:42:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:17.288 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:17.288 "name": "Existed_Raid", 00:31:17.288 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:17.288 "strip_size_kb": 64, 00:31:17.288 "state": "configuring", 00:31:17.288 "raid_level": "concat", 00:31:17.288 "superblock": true, 00:31:17.288 "num_base_bdevs": 3, 00:31:17.288 "num_base_bdevs_discovered": 2, 00:31:17.288 "num_base_bdevs_operational": 3, 00:31:17.288 "base_bdevs_list": [ 00:31:17.288 { 00:31:17.288 "name": "BaseBdev1", 00:31:17.288 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 
00:31:17.288 "is_configured": true, 00:31:17.288 "data_offset": 2048, 00:31:17.288 "data_size": 63488 00:31:17.288 }, 00:31:17.288 { 00:31:17.288 "name": null, 00:31:17.288 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:17.288 "is_configured": false, 00:31:17.288 "data_offset": 2048, 00:31:17.288 "data_size": 63488 00:31:17.288 }, 00:31:17.288 { 00:31:17.288 "name": "BaseBdev3", 00:31:17.288 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:17.288 "is_configured": true, 00:31:17.288 "data_offset": 2048, 00:31:17.288 "data_size": 63488 00:31:17.288 } 00:31:17.288 ] 00:31:17.288 }' 00:31:17.288 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:17.288 11:42:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:17.853 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.853 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:31:17.853 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:31:17.853 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:18.110 [2024-06-10 11:42:01.865580] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:18.110 11:42:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:18.110 11:42:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:18.368 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:18.368 "name": "Existed_Raid", 00:31:18.368 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:18.368 "strip_size_kb": 64, 00:31:18.368 "state": "configuring", 00:31:18.368 "raid_level": "concat", 00:31:18.368 "superblock": true, 00:31:18.368 "num_base_bdevs": 3, 00:31:18.368 "num_base_bdevs_discovered": 1, 00:31:18.368 "num_base_bdevs_operational": 3, 00:31:18.368 "base_bdevs_list": [ 00:31:18.368 { 00:31:18.368 "name": null, 00:31:18.368 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:18.368 "is_configured": false, 00:31:18.368 "data_offset": 2048, 00:31:18.368 "data_size": 63488 00:31:18.368 }, 00:31:18.368 { 00:31:18.368 "name": null, 00:31:18.368 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:18.368 "is_configured": false, 00:31:18.368 
"data_offset": 2048, 00:31:18.368 "data_size": 63488 00:31:18.368 }, 00:31:18.368 { 00:31:18.368 "name": "BaseBdev3", 00:31:18.368 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:18.368 "is_configured": true, 00:31:18.368 "data_offset": 2048, 00:31:18.368 "data_size": 63488 00:31:18.368 } 00:31:18.368 ] 00:31:18.368 }' 00:31:18.368 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:18.368 11:42:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:18.626 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:18.626 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:31:18.884 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:31:18.884 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:31:19.142 [2024-06-10 11:42:02.903937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:19.142 11:42:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:19.142 11:42:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:19.400 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:19.400 "name": "Existed_Raid", 00:31:19.400 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:19.400 "strip_size_kb": 64, 00:31:19.400 "state": "configuring", 00:31:19.400 "raid_level": "concat", 00:31:19.400 "superblock": true, 00:31:19.400 "num_base_bdevs": 3, 00:31:19.400 "num_base_bdevs_discovered": 2, 00:31:19.400 "num_base_bdevs_operational": 3, 00:31:19.400 "base_bdevs_list": [ 00:31:19.400 { 00:31:19.400 "name": null, 00:31:19.400 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:19.400 "is_configured": false, 00:31:19.400 "data_offset": 2048, 00:31:19.400 "data_size": 63488 00:31:19.400 }, 00:31:19.400 { 00:31:19.400 "name": "BaseBdev2", 00:31:19.400 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:19.400 "is_configured": true, 00:31:19.400 "data_offset": 2048, 00:31:19.400 "data_size": 63488 00:31:19.400 }, 00:31:19.400 { 00:31:19.400 "name": "BaseBdev3", 00:31:19.400 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:19.400 "is_configured": true, 
00:31:19.400 "data_offset": 2048, 00:31:19.400 "data_size": 63488 00:31:19.400 } 00:31:19.400 ] 00:31:19.400 }' 00:31:19.400 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:19.400 11:42:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:19.662 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:19.662 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:31:20.014 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:31:20.014 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:20.014 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:31:20.014 11:42:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 392eefeb-750c-4579-8c8b-f37c83e5a906 00:31:20.272 [2024-06-10 11:42:04.109898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:31:20.272 [2024-06-10 11:42:04.110012] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x180e340 00:31:20.272 [2024-06-10 11:42:04.110022] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:31:20.272 [2024-06-10 11:42:04.110144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c0140 00:31:20.272 [2024-06-10 11:42:04.110222] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x180e340 00:31:20.272 
[2024-06-10 11:42:04.110229] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x180e340 00:31:20.272 [2024-06-10 11:42:04.110292] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:20.272 NewBaseBdev 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:20.272 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:20.530 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:31:20.530 [ 00:31:20.530 { 00:31:20.530 "name": "NewBaseBdev", 00:31:20.530 "aliases": [ 00:31:20.530 "392eefeb-750c-4579-8c8b-f37c83e5a906" 00:31:20.530 ], 00:31:20.530 "product_name": "Malloc disk", 00:31:20.530 "block_size": 512, 00:31:20.530 "num_blocks": 65536, 00:31:20.530 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:20.530 "assigned_rate_limits": { 00:31:20.530 "rw_ios_per_sec": 0, 00:31:20.530 "rw_mbytes_per_sec": 0, 00:31:20.530 "r_mbytes_per_sec": 0, 00:31:20.530 "w_mbytes_per_sec": 0 00:31:20.530 }, 00:31:20.530 "claimed": true, 00:31:20.530 "claim_type": 
"exclusive_write", 00:31:20.530 "zoned": false, 00:31:20.530 "supported_io_types": { 00:31:20.530 "read": true, 00:31:20.530 "write": true, 00:31:20.530 "unmap": true, 00:31:20.530 "write_zeroes": true, 00:31:20.530 "flush": true, 00:31:20.530 "reset": true, 00:31:20.530 "compare": false, 00:31:20.530 "compare_and_write": false, 00:31:20.530 "abort": true, 00:31:20.530 "nvme_admin": false, 00:31:20.530 "nvme_io": false 00:31:20.530 }, 00:31:20.530 "memory_domains": [ 00:31:20.530 { 00:31:20.530 "dma_device_id": "system", 00:31:20.530 "dma_device_type": 1 00:31:20.530 }, 00:31:20.530 { 00:31:20.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:20.530 "dma_device_type": 2 00:31:20.530 } 00:31:20.530 ], 00:31:20.530 "driver_specific": {} 00:31:20.530 } 00:31:20.530 ] 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:20.789 11:42:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:20.789 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:20.789 "name": "Existed_Raid", 00:31:20.789 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:20.789 "strip_size_kb": 64, 00:31:20.789 "state": "online", 00:31:20.789 "raid_level": "concat", 00:31:20.789 "superblock": true, 00:31:20.789 "num_base_bdevs": 3, 00:31:20.789 "num_base_bdevs_discovered": 3, 00:31:20.789 "num_base_bdevs_operational": 3, 00:31:20.789 "base_bdevs_list": [ 00:31:20.789 { 00:31:20.790 "name": "NewBaseBdev", 00:31:20.790 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:20.790 "is_configured": true, 00:31:20.790 "data_offset": 2048, 00:31:20.790 "data_size": 63488 00:31:20.790 }, 00:31:20.790 { 00:31:20.790 "name": "BaseBdev2", 00:31:20.790 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:20.790 "is_configured": true, 00:31:20.790 "data_offset": 2048, 00:31:20.790 "data_size": 63488 00:31:20.790 }, 00:31:20.790 { 00:31:20.790 "name": "BaseBdev3", 00:31:20.790 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:20.790 "is_configured": true, 00:31:20.790 "data_offset": 2048, 00:31:20.790 "data_size": 63488 00:31:20.790 } 00:31:20.790 ] 00:31:20.790 }' 00:31:20.790 11:42:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:20.790 11:42:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:31:21.355 11:42:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:21.355 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:21.613 [2024-06-10 11:42:05.313179] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:21.613 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:21.613 "name": "Existed_Raid", 00:31:21.613 "aliases": [ 00:31:21.613 "8d703fb7-12d0-431c-b48c-90ca7efdd923" 00:31:21.613 ], 00:31:21.613 "product_name": "Raid Volume", 00:31:21.613 "block_size": 512, 00:31:21.613 "num_blocks": 190464, 00:31:21.613 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:21.613 "assigned_rate_limits": { 00:31:21.613 "rw_ios_per_sec": 0, 00:31:21.613 "rw_mbytes_per_sec": 0, 00:31:21.613 "r_mbytes_per_sec": 0, 00:31:21.613 "w_mbytes_per_sec": 0 00:31:21.613 }, 00:31:21.613 "claimed": false, 00:31:21.613 "zoned": false, 00:31:21.613 "supported_io_types": { 00:31:21.613 "read": true, 00:31:21.613 "write": true, 00:31:21.613 "unmap": true, 00:31:21.613 "write_zeroes": true, 00:31:21.613 "flush": true, 00:31:21.613 "reset": true, 00:31:21.613 "compare": false, 00:31:21.613 "compare_and_write": false, 00:31:21.613 "abort": false, 00:31:21.613 "nvme_admin": false, 00:31:21.613 
"nvme_io": false 00:31:21.613 }, 00:31:21.613 "memory_domains": [ 00:31:21.613 { 00:31:21.613 "dma_device_id": "system", 00:31:21.613 "dma_device_type": 1 00:31:21.613 }, 00:31:21.613 { 00:31:21.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:21.613 "dma_device_type": 2 00:31:21.613 }, 00:31:21.613 { 00:31:21.613 "dma_device_id": "system", 00:31:21.613 "dma_device_type": 1 00:31:21.613 }, 00:31:21.613 { 00:31:21.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:21.614 "dma_device_type": 2 00:31:21.614 }, 00:31:21.614 { 00:31:21.614 "dma_device_id": "system", 00:31:21.614 "dma_device_type": 1 00:31:21.614 }, 00:31:21.614 { 00:31:21.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:21.614 "dma_device_type": 2 00:31:21.614 } 00:31:21.614 ], 00:31:21.614 "driver_specific": { 00:31:21.614 "raid": { 00:31:21.614 "uuid": "8d703fb7-12d0-431c-b48c-90ca7efdd923", 00:31:21.614 "strip_size_kb": 64, 00:31:21.614 "state": "online", 00:31:21.614 "raid_level": "concat", 00:31:21.614 "superblock": true, 00:31:21.614 "num_base_bdevs": 3, 00:31:21.614 "num_base_bdevs_discovered": 3, 00:31:21.614 "num_base_bdevs_operational": 3, 00:31:21.614 "base_bdevs_list": [ 00:31:21.614 { 00:31:21.614 "name": "NewBaseBdev", 00:31:21.614 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:21.614 "is_configured": true, 00:31:21.614 "data_offset": 2048, 00:31:21.614 "data_size": 63488 00:31:21.614 }, 00:31:21.614 { 00:31:21.614 "name": "BaseBdev2", 00:31:21.614 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:21.614 "is_configured": true, 00:31:21.614 "data_offset": 2048, 00:31:21.614 "data_size": 63488 00:31:21.614 }, 00:31:21.614 { 00:31:21.614 "name": "BaseBdev3", 00:31:21.614 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:21.614 "is_configured": true, 00:31:21.614 "data_offset": 2048, 00:31:21.614 "data_size": 63488 00:31:21.614 } 00:31:21.614 ] 00:31:21.614 } 00:31:21.614 } 00:31:21.614 }' 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 
-- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:31:21.614 BaseBdev2 00:31:21.614 BaseBdev3' 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:21.614 "name": "NewBaseBdev", 00:31:21.614 "aliases": [ 00:31:21.614 "392eefeb-750c-4579-8c8b-f37c83e5a906" 00:31:21.614 ], 00:31:21.614 "product_name": "Malloc disk", 00:31:21.614 "block_size": 512, 00:31:21.614 "num_blocks": 65536, 00:31:21.614 "uuid": "392eefeb-750c-4579-8c8b-f37c83e5a906", 00:31:21.614 "assigned_rate_limits": { 00:31:21.614 "rw_ios_per_sec": 0, 00:31:21.614 "rw_mbytes_per_sec": 0, 00:31:21.614 "r_mbytes_per_sec": 0, 00:31:21.614 "w_mbytes_per_sec": 0 00:31:21.614 }, 00:31:21.614 "claimed": true, 00:31:21.614 "claim_type": "exclusive_write", 00:31:21.614 "zoned": false, 00:31:21.614 "supported_io_types": { 00:31:21.614 "read": true, 00:31:21.614 "write": true, 00:31:21.614 "unmap": true, 00:31:21.614 "write_zeroes": true, 00:31:21.614 "flush": true, 00:31:21.614 "reset": true, 00:31:21.614 "compare": false, 00:31:21.614 "compare_and_write": false, 00:31:21.614 "abort": true, 00:31:21.614 "nvme_admin": false, 00:31:21.614 "nvme_io": false 00:31:21.614 }, 00:31:21.614 "memory_domains": [ 00:31:21.614 { 00:31:21.614 "dma_device_id": "system", 00:31:21.614 "dma_device_type": 1 00:31:21.614 }, 00:31:21.614 { 00:31:21.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:31:21.614 "dma_device_type": 2 00:31:21.614 } 00:31:21.614 ], 00:31:21.614 "driver_specific": {} 00:31:21.614 }' 00:31:21.614 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:21.872 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:22.130 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:22.130 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:22.130 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:22.130 11:42:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:22.130 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:22.130 "name": "BaseBdev2", 00:31:22.130 "aliases": [ 00:31:22.130 
"a68f9352-2674-4ebb-944e-ebde930b3836" 00:31:22.130 ], 00:31:22.130 "product_name": "Malloc disk", 00:31:22.130 "block_size": 512, 00:31:22.130 "num_blocks": 65536, 00:31:22.130 "uuid": "a68f9352-2674-4ebb-944e-ebde930b3836", 00:31:22.130 "assigned_rate_limits": { 00:31:22.130 "rw_ios_per_sec": 0, 00:31:22.130 "rw_mbytes_per_sec": 0, 00:31:22.130 "r_mbytes_per_sec": 0, 00:31:22.130 "w_mbytes_per_sec": 0 00:31:22.130 }, 00:31:22.130 "claimed": true, 00:31:22.130 "claim_type": "exclusive_write", 00:31:22.130 "zoned": false, 00:31:22.130 "supported_io_types": { 00:31:22.130 "read": true, 00:31:22.130 "write": true, 00:31:22.130 "unmap": true, 00:31:22.130 "write_zeroes": true, 00:31:22.130 "flush": true, 00:31:22.130 "reset": true, 00:31:22.130 "compare": false, 00:31:22.130 "compare_and_write": false, 00:31:22.130 "abort": true, 00:31:22.130 "nvme_admin": false, 00:31:22.130 "nvme_io": false 00:31:22.130 }, 00:31:22.130 "memory_domains": [ 00:31:22.130 { 00:31:22.130 "dma_device_id": "system", 00:31:22.130 "dma_device_type": 1 00:31:22.130 }, 00:31:22.130 { 00:31:22.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:22.130 "dma_device_type": 2 00:31:22.130 } 00:31:22.130 ], 00:31:22.130 "driver_specific": {} 00:31:22.130 }' 00:31:22.130 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:22.387 
11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:22.387 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:22.645 "name": "BaseBdev3", 00:31:22.645 "aliases": [ 00:31:22.645 "137bef49-f131-4964-9048-ac8f3f1fbb89" 00:31:22.645 ], 00:31:22.645 "product_name": "Malloc disk", 00:31:22.645 "block_size": 512, 00:31:22.645 "num_blocks": 65536, 00:31:22.645 "uuid": "137bef49-f131-4964-9048-ac8f3f1fbb89", 00:31:22.645 "assigned_rate_limits": { 00:31:22.645 "rw_ios_per_sec": 0, 00:31:22.645 "rw_mbytes_per_sec": 0, 00:31:22.645 "r_mbytes_per_sec": 0, 00:31:22.645 "w_mbytes_per_sec": 0 00:31:22.645 }, 00:31:22.645 "claimed": true, 00:31:22.645 "claim_type": "exclusive_write", 00:31:22.645 "zoned": false, 00:31:22.645 "supported_io_types": { 00:31:22.645 "read": true, 00:31:22.645 "write": true, 00:31:22.645 "unmap": true, 00:31:22.645 "write_zeroes": true, 00:31:22.645 "flush": true, 00:31:22.645 "reset": true, 00:31:22.645 "compare": false, 00:31:22.645 "compare_and_write": false, 00:31:22.645 "abort": true, 00:31:22.645 "nvme_admin": false, 
00:31:22.645 "nvme_io": false 00:31:22.645 }, 00:31:22.645 "memory_domains": [ 00:31:22.645 { 00:31:22.645 "dma_device_id": "system", 00:31:22.645 "dma_device_type": 1 00:31:22.645 }, 00:31:22.645 { 00:31:22.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:22.645 "dma_device_type": 2 00:31:22.645 } 00:31:22.645 ], 00:31:22.645 "driver_specific": {} 00:31:22.645 }' 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:22.645 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:22.903 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:23.162 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:23.162 11:42:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:23.162 [2024-06-10 11:42:07.005441] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:23.162 [2024-06-10 
11:42:07.005459] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:23.162 [2024-06-10 11:42:07.005496] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:23.162 [2024-06-10 11:42:07.005529] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:23.162 [2024-06-10 11:42:07.005537] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180e340 name Existed_Raid, state offline 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 161031 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 161031 ']' 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 161031 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 161031 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 161031' 00:31:23.162 killing process with pid 161031 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 161031 00:31:23.162 [2024-06-10 11:42:07.071674] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:23.162 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 161031 00:31:23.162 
[2024-06-10 11:42:07.097364] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:23.420 11:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:31:23.420 00:31:23.420 real 0m21.833s 00:31:23.420 user 0m39.796s 00:31:23.420 sys 0m4.159s 00:31:23.420 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:23.420 11:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:23.420 ************************************ 00:31:23.420 END TEST raid_state_function_test_sb 00:31:23.420 ************************************ 00:31:23.420 11:42:07 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:31:23.420 11:42:07 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:31:23.420 11:42:07 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:23.420 11:42:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:23.679 ************************************ 00:31:23.679 START TEST raid_superblock_test 00:31:23.679 ************************************ 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 3 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # 
base_bdevs_pt_uuid=() 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=165100 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 165100 /var/tmp/spdk-raid.sock 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 165100 ']' 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:23.679 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:23.679 11:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:23.679 [2024-06-10 11:42:07.428614] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:31:23.679 [2024-06-10 11:42:07.428661] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid165100 ] 00:31:23.679 [2024-06-10 11:42:07.513256] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:23.679 [2024-06-10 11:42:07.592565] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:23.938 [2024-06-10 11:42:07.650966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:23.938 [2024-06-10 11:42:07.650994] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:24.506 11:42:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:24.506 11:42:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 
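The `waitforlisten 165100` step above blocks until the freshly started `bdev_svc` target is reachable on `/var/tmp/spdk-raid.sock` instead of racing it. A rough sketch of the same wait-with-timeout idea (helper name, demo socket path, and the Python one-liner that stands in for the starting target are all illustrative):

```shell
# Poll until a UNIX-domain socket appears, with a timeout.
wait_for_sock() {
    local sock=$1 timeout=${2:-10} i
    for ((i = 0; i < timeout * 10; i++)); do
        [ -S "$sock" ] && return 0
        sleep 0.1
    done
    echo "timed out waiting for $sock" >&2
    return 1
}

# Demo: create the socket after a short delay, as a starting target would.
sock=/tmp/demo-rpc.sock
rm -f "$sock"
( sleep 0.3
  python3 -c 'import socket,sys
s = socket.socket(socket.AF_UNIX)
s.bind(sys.argv[1])' "$sock" ) &
wait_for_sock "$sock" 5 && echo "socket ready"
```

The real helper goes further and issues an RPC to confirm the target is actually serving requests; existence of the socket file is only the first half of that check.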
00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:31:24.507 malloc1 00:31:24.507 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:24.766 [2024-06-10 11:42:08.541993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:24.766 [2024-06-10 11:42:08.542030] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:24.766 [2024-06-10 11:42:08.542044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbcc100 00:31:24.766 [2024-06-10 11:42:08.542052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:24.766 [2024-06-10 11:42:08.543266] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:24.766 [2024-06-10 11:42:08.543288] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:24.766 pt1 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:31:24.766 11:42:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:24.766 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:31:25.037 malloc2 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:25.037 [2024-06-10 11:42:08.898919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:25.037 [2024-06-10 11:42:08.898961] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:25.037 [2024-06-10 11:42:08.898976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbcd500 00:31:25.037 [2024-06-10 11:42:08.898985] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:25.037 [2024-06-10 11:42:08.900107] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:25.037 [2024-06-10 11:42:08.900129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:25.037 pt2 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 
00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:25.037 11:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:31:25.296 malloc3 00:31:25.296 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:31:25.556 [2024-06-10 11:42:09.252661] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:31:25.556 [2024-06-10 11:42:09.252698] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:25.556 [2024-06-10 11:42:09.252727] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd777a0 00:31:25.556 [2024-06-10 11:42:09.252735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:25.556 [2024-06-10 11:42:09.253912] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:25.556 [2024-06-10 11:42:09.253934] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:31:25.556 pt3 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:31:25.556 [2024-06-10 11:42:09.425183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:25.556 [2024-06-10 11:42:09.426187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:25.556 [2024-06-10 11:42:09.426228] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:31:25.556 [2024-06-10 11:42:09.426340] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd7ad40 00:31:25.556 [2024-06-10 11:42:09.426348] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:31:25.556 [2024-06-10 11:42:09.426488] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbcce90 00:31:25.556 [2024-06-10 11:42:09.426592] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd7ad40 00:31:25.556 [2024-06-10 11:42:09.426598] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd7ad40 00:31:25.556 [2024-06-10 11:42:09.426668] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:25.556 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:25.821 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:25.821 "name": "raid_bdev1", 00:31:25.821 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:25.821 "strip_size_kb": 64, 00:31:25.821 "state": "online", 00:31:25.821 "raid_level": "concat", 00:31:25.821 "superblock": true, 00:31:25.821 "num_base_bdevs": 3, 00:31:25.821 "num_base_bdevs_discovered": 3, 00:31:25.821 "num_base_bdevs_operational": 3, 00:31:25.821 "base_bdevs_list": [ 00:31:25.821 { 00:31:25.821 "name": "pt1", 00:31:25.821 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:25.821 "is_configured": true, 00:31:25.821 "data_offset": 2048, 00:31:25.821 "data_size": 63488 00:31:25.821 }, 00:31:25.821 { 00:31:25.821 "name": "pt2", 00:31:25.821 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:25.821 "is_configured": true, 00:31:25.821 "data_offset": 2048, 00:31:25.821 "data_size": 63488 00:31:25.821 }, 00:31:25.821 { 00:31:25.821 "name": "pt3", 00:31:25.821 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:25.821 "is_configured": true, 00:31:25.821 "data_offset": 2048, 00:31:25.821 "data_size": 63488 00:31:25.821 } 00:31:25.821 ] 00:31:25.821 }' 
00:31:25.821 11:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:25.821 11:42:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:26.389 [2024-06-10 11:42:10.251425] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:26.389 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:26.389 "name": "raid_bdev1", 00:31:26.389 "aliases": [ 00:31:26.389 "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726" 00:31:26.389 ], 00:31:26.389 "product_name": "Raid Volume", 00:31:26.389 "block_size": 512, 00:31:26.389 "num_blocks": 190464, 00:31:26.389 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:26.389 "assigned_rate_limits": { 00:31:26.389 "rw_ios_per_sec": 0, 00:31:26.389 "rw_mbytes_per_sec": 0, 00:31:26.389 "r_mbytes_per_sec": 0, 00:31:26.389 "w_mbytes_per_sec": 0 00:31:26.389 }, 00:31:26.389 "claimed": false, 00:31:26.389 "zoned": false, 00:31:26.389 "supported_io_types": { 00:31:26.389 "read": true, 00:31:26.389 "write": 
true, 00:31:26.389 "unmap": true, 00:31:26.389 "write_zeroes": true, 00:31:26.389 "flush": true, 00:31:26.389 "reset": true, 00:31:26.389 "compare": false, 00:31:26.390 "compare_and_write": false, 00:31:26.390 "abort": false, 00:31:26.390 "nvme_admin": false, 00:31:26.390 "nvme_io": false 00:31:26.390 }, 00:31:26.390 "memory_domains": [ 00:31:26.390 { 00:31:26.390 "dma_device_id": "system", 00:31:26.390 "dma_device_type": 1 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:26.390 "dma_device_type": 2 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "dma_device_id": "system", 00:31:26.390 "dma_device_type": 1 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:26.390 "dma_device_type": 2 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "dma_device_id": "system", 00:31:26.390 "dma_device_type": 1 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:26.390 "dma_device_type": 2 00:31:26.390 } 00:31:26.390 ], 00:31:26.390 "driver_specific": { 00:31:26.390 "raid": { 00:31:26.390 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:26.390 "strip_size_kb": 64, 00:31:26.390 "state": "online", 00:31:26.390 "raid_level": "concat", 00:31:26.390 "superblock": true, 00:31:26.390 "num_base_bdevs": 3, 00:31:26.390 "num_base_bdevs_discovered": 3, 00:31:26.390 "num_base_bdevs_operational": 3, 00:31:26.390 "base_bdevs_list": [ 00:31:26.390 { 00:31:26.390 "name": "pt1", 00:31:26.390 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:26.390 "is_configured": true, 00:31:26.390 "data_offset": 2048, 00:31:26.390 "data_size": 63488 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "name": "pt2", 00:31:26.390 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:26.390 "is_configured": true, 00:31:26.390 "data_offset": 2048, 00:31:26.390 "data_size": 63488 00:31:26.390 }, 00:31:26.390 { 00:31:26.390 "name": "pt3", 00:31:26.390 "uuid": "00000000-0000-0000-0000-000000000003", 
00:31:26.390 "is_configured": true, 00:31:26.390 "data_offset": 2048, 00:31:26.390 "data_size": 63488 00:31:26.390 } 00:31:26.390 ] 00:31:26.390 } 00:31:26.390 } 00:31:26.390 }' 00:31:26.390 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:26.390 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:31:26.390 pt2 00:31:26.390 pt3' 00:31:26.390 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:26.390 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:31:26.390 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:26.648 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:26.648 "name": "pt1", 00:31:26.648 "aliases": [ 00:31:26.648 "00000000-0000-0000-0000-000000000001" 00:31:26.648 ], 00:31:26.648 "product_name": "passthru", 00:31:26.648 "block_size": 512, 00:31:26.648 "num_blocks": 65536, 00:31:26.648 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:26.648 "assigned_rate_limits": { 00:31:26.648 "rw_ios_per_sec": 0, 00:31:26.648 "rw_mbytes_per_sec": 0, 00:31:26.648 "r_mbytes_per_sec": 0, 00:31:26.648 "w_mbytes_per_sec": 0 00:31:26.648 }, 00:31:26.648 "claimed": true, 00:31:26.648 "claim_type": "exclusive_write", 00:31:26.648 "zoned": false, 00:31:26.648 "supported_io_types": { 00:31:26.648 "read": true, 00:31:26.648 "write": true, 00:31:26.648 "unmap": true, 00:31:26.648 "write_zeroes": true, 00:31:26.648 "flush": true, 00:31:26.648 "reset": true, 00:31:26.648 "compare": false, 00:31:26.648 "compare_and_write": false, 00:31:26.648 "abort": true, 00:31:26.648 "nvme_admin": false, 00:31:26.648 "nvme_io": false 00:31:26.648 }, 00:31:26.648 "memory_domains": 
[ 00:31:26.648 { 00:31:26.649 "dma_device_id": "system", 00:31:26.649 "dma_device_type": 1 00:31:26.649 }, 00:31:26.649 { 00:31:26.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:26.649 "dma_device_type": 2 00:31:26.649 } 00:31:26.649 ], 00:31:26.649 "driver_specific": { 00:31:26.649 "passthru": { 00:31:26.649 "name": "pt1", 00:31:26.649 "base_bdev_name": "malloc1" 00:31:26.649 } 00:31:26.649 } 00:31:26.649 }' 00:31:26.649 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:26.649 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:26.649 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:26.649 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:31:26.907 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:31:27.167 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:27.167 "name": "pt2", 00:31:27.167 "aliases": [ 00:31:27.167 "00000000-0000-0000-0000-000000000002" 00:31:27.167 ], 00:31:27.167 "product_name": "passthru", 00:31:27.167 "block_size": 512, 00:31:27.167 "num_blocks": 65536, 00:31:27.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:27.167 "assigned_rate_limits": { 00:31:27.167 "rw_ios_per_sec": 0, 00:31:27.167 "rw_mbytes_per_sec": 0, 00:31:27.167 "r_mbytes_per_sec": 0, 00:31:27.167 "w_mbytes_per_sec": 0 00:31:27.167 }, 00:31:27.167 "claimed": true, 00:31:27.167 "claim_type": "exclusive_write", 00:31:27.167 "zoned": false, 00:31:27.167 "supported_io_types": { 00:31:27.167 "read": true, 00:31:27.167 "write": true, 00:31:27.167 "unmap": true, 00:31:27.167 "write_zeroes": true, 00:31:27.167 "flush": true, 00:31:27.167 "reset": true, 00:31:27.167 "compare": false, 00:31:27.167 "compare_and_write": false, 00:31:27.167 "abort": true, 00:31:27.167 "nvme_admin": false, 00:31:27.167 "nvme_io": false 00:31:27.167 }, 00:31:27.167 "memory_domains": [ 00:31:27.167 { 00:31:27.167 "dma_device_id": "system", 00:31:27.167 "dma_device_type": 1 00:31:27.167 }, 00:31:27.167 { 00:31:27.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:27.167 "dma_device_type": 2 00:31:27.167 } 00:31:27.167 ], 00:31:27.167 "driver_specific": { 00:31:27.167 "passthru": { 00:31:27.167 "name": "pt2", 00:31:27.167 "base_bdev_name": "malloc2" 00:31:27.167 } 00:31:27.167 } 00:31:27.167 }' 00:31:27.167 11:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:27.167 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:27.167 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:27.167 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:27.167 11:42:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:27.426 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:27.685 "name": "pt3", 00:31:27.685 "aliases": [ 00:31:27.685 "00000000-0000-0000-0000-000000000003" 00:31:27.685 ], 00:31:27.685 "product_name": "passthru", 00:31:27.685 "block_size": 512, 00:31:27.685 "num_blocks": 65536, 00:31:27.685 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:27.685 "assigned_rate_limits": { 00:31:27.685 "rw_ios_per_sec": 0, 00:31:27.685 "rw_mbytes_per_sec": 0, 00:31:27.685 "r_mbytes_per_sec": 0, 00:31:27.685 "w_mbytes_per_sec": 0 00:31:27.685 }, 00:31:27.685 "claimed": true, 00:31:27.685 "claim_type": "exclusive_write", 00:31:27.685 "zoned": false, 00:31:27.685 "supported_io_types": { 00:31:27.685 "read": true, 00:31:27.685 "write": true, 00:31:27.685 "unmap": true, 00:31:27.685 "write_zeroes": true, 00:31:27.685 
"flush": true, 00:31:27.685 "reset": true, 00:31:27.685 "compare": false, 00:31:27.685 "compare_and_write": false, 00:31:27.685 "abort": true, 00:31:27.685 "nvme_admin": false, 00:31:27.685 "nvme_io": false 00:31:27.685 }, 00:31:27.685 "memory_domains": [ 00:31:27.685 { 00:31:27.685 "dma_device_id": "system", 00:31:27.685 "dma_device_type": 1 00:31:27.685 }, 00:31:27.685 { 00:31:27.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:27.685 "dma_device_type": 2 00:31:27.685 } 00:31:27.685 ], 00:31:27.685 "driver_specific": { 00:31:27.685 "passthru": { 00:31:27.685 "name": "pt3", 00:31:27.685 "base_bdev_name": "malloc3" 00:31:27.685 } 00:31:27.685 } 00:31:27.685 }' 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:27.685 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:27.944 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:27.944 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:27.944 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:27.944 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:27.945 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:27.945 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:31:27.945 11:42:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:28.204 [2024-06-10 11:42:11.935791] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:28.204 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726 00:31:28.204 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726 ']' 00:31:28.204 11:42:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:28.204 [2024-06-10 11:42:12.116098] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:28.204 [2024-06-10 11:42:12.116119] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:28.204 [2024-06-10 11:42:12.116155] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:28.204 [2024-06-10 11:42:12.116193] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:28.204 [2024-06-10 11:42:12.116201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd7ad40 name raid_bdev1, state offline 00:31:28.204 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:28.204 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:31:28.462 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:31:28.462 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:31:28.462 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- 
# for i in "${base_bdevs_pt[@]}" 00:31:28.462 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:31:28.721 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:28.721 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:31:28.721 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:28.721 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:31:28.984 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:31:28.984 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:31:29.248 11:42:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:31:29.248 [2024-06-10 11:42:13.142737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:31:29.248 [2024-06-10 11:42:13.143732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:31:29.248 [2024-06-10 11:42:13.143763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:31:29.248 [2024-06-10 11:42:13.143796] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:31:29.248 [2024-06-10 11:42:13.143826] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:31:29.248 [2024-06-10 11:42:13.143857] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:31:29.248 [2024-06-10 11:42:13.143876] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:29.248 [2024-06-10 11:42:13.143884] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbcb810 name raid_bdev1, state configuring 00:31:29.248 request: 00:31:29.248 { 00:31:29.248 "name": "raid_bdev1", 00:31:29.248 "raid_level": "concat", 00:31:29.248 "base_bdevs": [ 00:31:29.248 "malloc1", 00:31:29.248 "malloc2", 00:31:29.248 "malloc3" 00:31:29.248 ], 00:31:29.248 "superblock": false, 00:31:29.248 "strip_size_kb": 64, 00:31:29.248 "method": "bdev_raid_create", 00:31:29.248 "req_id": 1 00:31:29.248 } 00:31:29.248 Got JSON-RPC error response 00:31:29.248 response: 00:31:29.248 { 00:31:29.248 "code": -17, 00:31:29.248 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:31:29.248 } 00:31:29.248 11:42:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:31:29.248 11:42:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:31:29.248 11:42:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:31:29.248 11:42:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:31:29.248 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:29.248 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:31:29.507 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:31:29.507 11:42:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:31:29.507 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:29.766 [2024-06-10 11:42:13.479567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:29.766 [2024-06-10 11:42:13.479607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:29.766 [2024-06-10 11:42:13.479636] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd78ad0 00:31:29.766 [2024-06-10 11:42:13.479644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:29.766 [2024-06-10 11:42:13.480874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:29.766 [2024-06-10 11:42:13.480897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:29.766 [2024-06-10 11:42:13.480949] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:31:29.766 [2024-06-10 11:42:13.480973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:29.766 pt1 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:29.767 11:42:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:29.767 "name": "raid_bdev1", 00:31:29.767 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:29.767 "strip_size_kb": 64, 00:31:29.767 "state": "configuring", 00:31:29.767 "raid_level": "concat", 00:31:29.767 "superblock": true, 00:31:29.767 "num_base_bdevs": 3, 00:31:29.767 "num_base_bdevs_discovered": 1, 00:31:29.767 "num_base_bdevs_operational": 3, 00:31:29.767 "base_bdevs_list": [ 00:31:29.767 { 00:31:29.767 "name": "pt1", 00:31:29.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:29.767 "is_configured": true, 00:31:29.767 "data_offset": 2048, 00:31:29.767 "data_size": 63488 00:31:29.767 }, 00:31:29.767 { 00:31:29.767 "name": null, 00:31:29.767 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:29.767 "is_configured": false, 00:31:29.767 "data_offset": 2048, 00:31:29.767 "data_size": 63488 00:31:29.767 }, 00:31:29.767 { 00:31:29.767 "name": null, 00:31:29.767 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:29.767 "is_configured": false, 00:31:29.767 "data_offset": 2048, 00:31:29.767 "data_size": 63488 00:31:29.767 } 00:31:29.767 ] 00:31:29.767 }' 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:31:29.767 11:42:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:30.335 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:31:30.335 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:30.594 [2024-06-10 11:42:14.325762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:30.594 [2024-06-10 11:42:14.325797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:30.594 [2024-06-10 11:42:14.325812] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd75370 00:31:30.594 [2024-06-10 11:42:14.325821] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:30.594 [2024-06-10 11:42:14.326069] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:30.594 [2024-06-10 11:42:14.326081] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:30.594 [2024-06-10 11:42:14.326131] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:31:30.594 [2024-06-10 11:42:14.326145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:30.594 pt2 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:31:30.594 [2024-06-10 11:42:14.494210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:30.594 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:30.595 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:30.595 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:30.854 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:30.854 "name": "raid_bdev1", 00:31:30.854 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:30.854 "strip_size_kb": 64, 00:31:30.854 "state": "configuring", 00:31:30.854 "raid_level": "concat", 00:31:30.854 "superblock": true, 00:31:30.854 "num_base_bdevs": 3, 00:31:30.854 "num_base_bdevs_discovered": 1, 00:31:30.854 "num_base_bdevs_operational": 3, 00:31:30.854 "base_bdevs_list": [ 00:31:30.854 { 00:31:30.854 "name": "pt1", 00:31:30.854 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:30.854 "is_configured": true, 00:31:30.854 "data_offset": 2048, 00:31:30.854 "data_size": 63488 00:31:30.854 }, 00:31:30.854 { 00:31:30.854 "name": null, 00:31:30.854 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:31:30.854 "is_configured": false, 00:31:30.854 "data_offset": 2048, 00:31:30.854 "data_size": 63488 00:31:30.854 }, 00:31:30.854 { 00:31:30.854 "name": null, 00:31:30.854 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:30.854 "is_configured": false, 00:31:30.854 "data_offset": 2048, 00:31:30.854 "data_size": 63488 00:31:30.854 } 00:31:30.854 ] 00:31:30.854 }' 00:31:30.854 11:42:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:30.854 11:42:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:31.422 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:31:31.422 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:31.422 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:31.422 [2024-06-10 11:42:15.304295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:31.422 [2024-06-10 11:42:15.304333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:31.422 [2024-06-10 11:42:15.304348] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd7b400 00:31:31.422 [2024-06-10 11:42:15.304357] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:31.422 [2024-06-10 11:42:15.304621] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:31.422 [2024-06-10 11:42:15.304636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:31.422 [2024-06-10 11:42:15.304684] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:31:31.422 [2024-06-10 11:42:15.304700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt2 is claimed 00:31:31.422 pt2 00:31:31.422 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:31:31.422 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:31.423 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:31:31.682 [2024-06-10 11:42:15.488763] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:31:31.682 [2024-06-10 11:42:15.488786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:31.682 [2024-06-10 11:42:15.488814] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbcc4d0 00:31:31.682 [2024-06-10 11:42:15.488822] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:31.682 [2024-06-10 11:42:15.489023] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:31.682 [2024-06-10 11:42:15.489034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:31:31.682 [2024-06-10 11:42:15.489069] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:31:31.682 [2024-06-10 11:42:15.489080] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:31:31.682 [2024-06-10 11:42:15.489149] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd767f0 00:31:31.682 [2024-06-10 11:42:15.489155] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:31:31.682 [2024-06-10 11:42:15.489263] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd7c9f0 00:31:31.682 [2024-06-10 11:42:15.489349] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd767f0 00:31:31.682 [2024-06-10 11:42:15.489356] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd767f0 00:31:31.682 [2024-06-10 11:42:15.489418] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:31.682 pt3 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:31.682 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:31.941 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:31.941 "name": 
"raid_bdev1", 00:31:31.941 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:31.941 "strip_size_kb": 64, 00:31:31.941 "state": "online", 00:31:31.941 "raid_level": "concat", 00:31:31.941 "superblock": true, 00:31:31.941 "num_base_bdevs": 3, 00:31:31.941 "num_base_bdevs_discovered": 3, 00:31:31.941 "num_base_bdevs_operational": 3, 00:31:31.941 "base_bdevs_list": [ 00:31:31.941 { 00:31:31.941 "name": "pt1", 00:31:31.941 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:31.941 "is_configured": true, 00:31:31.941 "data_offset": 2048, 00:31:31.941 "data_size": 63488 00:31:31.941 }, 00:31:31.941 { 00:31:31.941 "name": "pt2", 00:31:31.941 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:31.941 "is_configured": true, 00:31:31.941 "data_offset": 2048, 00:31:31.941 "data_size": 63488 00:31:31.941 }, 00:31:31.941 { 00:31:31.941 "name": "pt3", 00:31:31.941 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:31.941 "is_configured": true, 00:31:31.941 "data_offset": 2048, 00:31:31.941 "data_size": 63488 00:31:31.941 } 00:31:31.941 ] 00:31:31.941 }' 00:31:31.941 11:42:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:31.941 11:42:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 
00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:32.509 [2024-06-10 11:42:16.327107] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:32.509 "name": "raid_bdev1", 00:31:32.509 "aliases": [ 00:31:32.509 "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726" 00:31:32.509 ], 00:31:32.509 "product_name": "Raid Volume", 00:31:32.509 "block_size": 512, 00:31:32.509 "num_blocks": 190464, 00:31:32.509 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:32.509 "assigned_rate_limits": { 00:31:32.509 "rw_ios_per_sec": 0, 00:31:32.509 "rw_mbytes_per_sec": 0, 00:31:32.509 "r_mbytes_per_sec": 0, 00:31:32.509 "w_mbytes_per_sec": 0 00:31:32.509 }, 00:31:32.509 "claimed": false, 00:31:32.509 "zoned": false, 00:31:32.509 "supported_io_types": { 00:31:32.509 "read": true, 00:31:32.509 "write": true, 00:31:32.509 "unmap": true, 00:31:32.509 "write_zeroes": true, 00:31:32.509 "flush": true, 00:31:32.509 "reset": true, 00:31:32.509 "compare": false, 00:31:32.509 "compare_and_write": false, 00:31:32.509 "abort": false, 00:31:32.509 "nvme_admin": false, 00:31:32.509 "nvme_io": false 00:31:32.509 }, 00:31:32.509 "memory_domains": [ 00:31:32.509 { 00:31:32.509 "dma_device_id": "system", 00:31:32.509 "dma_device_type": 1 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:32.509 "dma_device_type": 2 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 "dma_device_id": "system", 00:31:32.509 "dma_device_type": 1 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:32.509 "dma_device_type": 2 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 "dma_device_id": "system", 00:31:32.509 "dma_device_type": 1 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:32.509 "dma_device_type": 2 00:31:32.509 } 00:31:32.509 ], 00:31:32.509 "driver_specific": { 00:31:32.509 "raid": { 00:31:32.509 "uuid": "7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726", 00:31:32.509 "strip_size_kb": 64, 00:31:32.509 "state": "online", 00:31:32.509 "raid_level": "concat", 00:31:32.509 "superblock": true, 00:31:32.509 "num_base_bdevs": 3, 00:31:32.509 "num_base_bdevs_discovered": 3, 00:31:32.509 "num_base_bdevs_operational": 3, 00:31:32.509 "base_bdevs_list": [ 00:31:32.509 { 00:31:32.509 "name": "pt1", 00:31:32.509 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:32.509 "is_configured": true, 00:31:32.509 "data_offset": 2048, 00:31:32.509 "data_size": 63488 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 "name": "pt2", 00:31:32.509 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:32.509 "is_configured": true, 00:31:32.509 "data_offset": 2048, 00:31:32.509 "data_size": 63488 00:31:32.509 }, 00:31:32.509 { 00:31:32.509 "name": "pt3", 00:31:32.509 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:32.509 "is_configured": true, 00:31:32.509 "data_offset": 2048, 00:31:32.509 "data_size": 63488 00:31:32.509 } 00:31:32.509 ] 00:31:32.509 } 00:31:32.509 } 00:31:32.509 }' 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:31:32.509 pt2 00:31:32.509 pt3' 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:31:32.509 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:32.769 11:42:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:32.769 "name": "pt1", 00:31:32.769 "aliases": [ 00:31:32.769 "00000000-0000-0000-0000-000000000001" 00:31:32.769 ], 00:31:32.769 "product_name": "passthru", 00:31:32.769 "block_size": 512, 00:31:32.769 "num_blocks": 65536, 00:31:32.769 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:32.769 "assigned_rate_limits": { 00:31:32.769 "rw_ios_per_sec": 0, 00:31:32.769 "rw_mbytes_per_sec": 0, 00:31:32.769 "r_mbytes_per_sec": 0, 00:31:32.769 "w_mbytes_per_sec": 0 00:31:32.769 }, 00:31:32.769 "claimed": true, 00:31:32.769 "claim_type": "exclusive_write", 00:31:32.769 "zoned": false, 00:31:32.769 "supported_io_types": { 00:31:32.769 "read": true, 00:31:32.769 "write": true, 00:31:32.769 "unmap": true, 00:31:32.769 "write_zeroes": true, 00:31:32.769 "flush": true, 00:31:32.769 "reset": true, 00:31:32.769 "compare": false, 00:31:32.769 "compare_and_write": false, 00:31:32.769 "abort": true, 00:31:32.769 "nvme_admin": false, 00:31:32.769 "nvme_io": false 00:31:32.769 }, 00:31:32.769 "memory_domains": [ 00:31:32.769 { 00:31:32.769 "dma_device_id": "system", 00:31:32.769 "dma_device_type": 1 00:31:32.769 }, 00:31:32.769 { 00:31:32.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:32.769 "dma_device_type": 2 00:31:32.769 } 00:31:32.769 ], 00:31:32.769 "driver_specific": { 00:31:32.769 "passthru": { 00:31:32.769 "name": "pt1", 00:31:32.769 "base_bdev_name": "malloc1" 00:31:32.769 } 00:31:32.769 } 00:31:32.769 }' 00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:32.769 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:31:33.028 11:42:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:33.287 "name": "pt2", 00:31:33.287 "aliases": [ 00:31:33.287 "00000000-0000-0000-0000-000000000002" 00:31:33.287 ], 00:31:33.287 "product_name": "passthru", 00:31:33.287 "block_size": 512, 00:31:33.287 "num_blocks": 65536, 00:31:33.287 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:33.287 "assigned_rate_limits": { 00:31:33.287 "rw_ios_per_sec": 0, 00:31:33.287 "rw_mbytes_per_sec": 0, 00:31:33.287 "r_mbytes_per_sec": 0, 00:31:33.287 "w_mbytes_per_sec": 0 00:31:33.287 }, 00:31:33.287 "claimed": true, 00:31:33.287 "claim_type": "exclusive_write", 00:31:33.287 "zoned": false, 00:31:33.287 "supported_io_types": { 00:31:33.287 "read": true, 00:31:33.287 "write": true, 00:31:33.287 "unmap": true, 00:31:33.287 "write_zeroes": true, 00:31:33.287 "flush": true, 00:31:33.287 "reset": 
true, 00:31:33.287 "compare": false, 00:31:33.287 "compare_and_write": false, 00:31:33.287 "abort": true, 00:31:33.287 "nvme_admin": false, 00:31:33.287 "nvme_io": false 00:31:33.287 }, 00:31:33.287 "memory_domains": [ 00:31:33.287 { 00:31:33.287 "dma_device_id": "system", 00:31:33.287 "dma_device_type": 1 00:31:33.287 }, 00:31:33.287 { 00:31:33.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:33.287 "dma_device_type": 2 00:31:33.287 } 00:31:33.287 ], 00:31:33.287 "driver_specific": { 00:31:33.287 "passthru": { 00:31:33.287 "name": "pt2", 00:31:33.287 "base_bdev_name": "malloc2" 00:31:33.287 } 00:31:33.287 } 00:31:33.287 }' 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:33.287 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:33.546 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:33.546 "name": "pt3", 00:31:33.546 "aliases": [ 00:31:33.546 "00000000-0000-0000-0000-000000000003" 00:31:33.546 ], 00:31:33.546 "product_name": "passthru", 00:31:33.546 "block_size": 512, 00:31:33.547 "num_blocks": 65536, 00:31:33.547 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:33.547 "assigned_rate_limits": { 00:31:33.547 "rw_ios_per_sec": 0, 00:31:33.547 "rw_mbytes_per_sec": 0, 00:31:33.547 "r_mbytes_per_sec": 0, 00:31:33.547 "w_mbytes_per_sec": 0 00:31:33.547 }, 00:31:33.547 "claimed": true, 00:31:33.547 "claim_type": "exclusive_write", 00:31:33.547 "zoned": false, 00:31:33.547 "supported_io_types": { 00:31:33.547 "read": true, 00:31:33.547 "write": true, 00:31:33.547 "unmap": true, 00:31:33.547 "write_zeroes": true, 00:31:33.547 "flush": true, 00:31:33.547 "reset": true, 00:31:33.547 "compare": false, 00:31:33.547 "compare_and_write": false, 00:31:33.547 "abort": true, 00:31:33.547 "nvme_admin": false, 00:31:33.547 "nvme_io": false 00:31:33.547 }, 00:31:33.547 "memory_domains": [ 00:31:33.547 { 00:31:33.547 "dma_device_id": "system", 00:31:33.547 "dma_device_type": 1 00:31:33.547 }, 00:31:33.547 { 00:31:33.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:33.547 "dma_device_type": 2 00:31:33.547 } 00:31:33.547 ], 00:31:33.547 "driver_specific": { 00:31:33.547 "passthru": { 00:31:33.547 "name": "pt3", 00:31:33.547 "base_bdev_name": "malloc3" 00:31:33.547 } 00:31:33.547 } 00:31:33.547 }' 00:31:33.547 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:33.806 11:42:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:33.806 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:31:34.065 [2024-06-10 11:42:17.951326] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726 '!=' 7dee1aa3-7c5d-48fa-aa0c-83b8f29b8726 ']' 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 165100 00:31:34.065 11:42:17 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 165100 ']' 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 165100 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:34.065 11:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 165100 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 165100' 00:31:34.324 killing process with pid 165100 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 165100 00:31:34.324 [2024-06-10 11:42:18.021152] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:34.324 [2024-06-10 11:42:18.021191] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:34.324 [2024-06-10 11:42:18.021228] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:34.324 [2024-06-10 11:42:18.021237] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd767f0 name raid_bdev1, state offline 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 165100 00:31:34.324 [2024-06-10 11:42:18.048825] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:31:34.324 00:31:34.324 real 0m10.868s 00:31:34.324 user 0m19.415s 00:31:34.324 sys 0m2.047s 00:31:34.324 11:42:18 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:34.324 11:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:34.324 ************************************ 00:31:34.325 END TEST raid_superblock_test 00:31:34.325 ************************************ 00:31:34.584 11:42:18 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:31:34.584 11:42:18 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:31:34.584 11:42:18 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:34.584 11:42:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:34.584 ************************************ 00:31:34.584 START TEST raid_read_error_test 00:31:34.584 ************************************ 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 read 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:31:34.584 
11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:31:34.584 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.8tjtpy0aaG 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=166845 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 166845 /var/tmp/spdk-raid.sock 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 166845 ']' 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:34.585 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:34.585 11:42:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:34.585 [2024-06-10 11:42:18.394389] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:31:34.585 [2024-06-10 11:42:18.394440] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid166845 ] 00:31:34.585 [2024-06-10 11:42:18.481651] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:34.844 [2024-06-10 11:42:18.569847] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:34.844 [2024-06-10 11:42:18.629661] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:34.844 [2024-06-10 11:42:18.629693] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:35.412 11:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:35.412 11:42:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:31:35.412 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:31:35.412 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:31:35.412 BaseBdev1_malloc 00:31:35.672 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:31:35.672 true 00:31:35.672 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:31:35.931 [2024-06-10 11:42:19.702102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:31:35.931 [2024-06-10 11:42:19.702139] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:35.931 
[2024-06-10 11:42:19.702153] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13deb10 00:31:35.931 [2024-06-10 11:42:19.702178] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:35.931 [2024-06-10 11:42:19.703546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:35.931 [2024-06-10 11:42:19.703568] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:35.931 BaseBdev1 00:31:35.931 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:31:35.931 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:31:36.190 BaseBdev2_malloc 00:31:36.190 11:42:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:31:36.190 true 00:31:36.190 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:31:36.449 [2024-06-10 11:42:20.232474] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:31:36.449 [2024-06-10 11:42:20.232510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:36.449 [2024-06-10 11:42:20.232524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e3280 00:31:36.449 [2024-06-10 11:42:20.232532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:36.449 [2024-06-10 11:42:20.233613] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:36.449 [2024-06-10 11:42:20.233633] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:31:36.449 BaseBdev2 00:31:36.449 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:31:36.449 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:31:36.708 BaseBdev3_malloc 00:31:36.708 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:31:36.708 true 00:31:36.708 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:31:36.968 [2024-06-10 11:42:20.765672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:31:36.968 [2024-06-10 11:42:20.765708] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:36.968 [2024-06-10 11:42:20.765740] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e5ab0 00:31:36.968 [2024-06-10 11:42:20.765749] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:36.968 [2024-06-10 11:42:20.766916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:36.968 [2024-06-10 11:42:20.766938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:31:36.968 BaseBdev3 00:31:36.968 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:31:37.228 [2024-06-10 11:42:20.942157] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:37.228 [2024-06-10 11:42:20.943165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:37.228 [2024-06-10 11:42:20.943210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:37.228 [2024-06-10 11:42:20.943357] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e40b0 00:31:37.228 [2024-06-10 11:42:20.943365] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:31:37.228 [2024-06-10 11:42:20.943503] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e0520 00:31:37.228 [2024-06-10 11:42:20.943608] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e40b0 00:31:37.228 [2024-06-10 11:42:20.943618] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13e40b0 00:31:37.228 [2024-06-10 11:42:20.943689] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:37.228 11:42:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:37.228 11:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:37.228 "name": "raid_bdev1", 00:31:37.228 "uuid": "06560917-bf7d-432f-9a65-6a04c2918f87", 00:31:37.228 "strip_size_kb": 64, 00:31:37.228 "state": "online", 00:31:37.228 "raid_level": "concat", 00:31:37.228 "superblock": true, 00:31:37.228 "num_base_bdevs": 3, 00:31:37.228 "num_base_bdevs_discovered": 3, 00:31:37.228 "num_base_bdevs_operational": 3, 00:31:37.228 "base_bdevs_list": [ 00:31:37.228 { 00:31:37.228 "name": "BaseBdev1", 00:31:37.228 "uuid": "7cb3aa33-6d69-5145-9fb1-b1ecfe7440b4", 00:31:37.228 "is_configured": true, 00:31:37.228 "data_offset": 2048, 00:31:37.228 "data_size": 63488 00:31:37.228 }, 00:31:37.228 { 00:31:37.228 "name": "BaseBdev2", 00:31:37.228 "uuid": "4310680f-1669-54f7-9092-e55bd042c1ab", 00:31:37.228 "is_configured": true, 00:31:37.228 "data_offset": 2048, 00:31:37.228 "data_size": 63488 00:31:37.228 }, 00:31:37.228 { 00:31:37.228 "name": "BaseBdev3", 00:31:37.228 "uuid": "1f07369b-5b86-5273-8870-6f94e098b23d", 00:31:37.228 "is_configured": true, 00:31:37.228 "data_offset": 2048, 00:31:37.228 "data_size": 63488 00:31:37.228 } 00:31:37.228 ] 00:31:37.228 }' 00:31:37.228 11:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:37.228 11:42:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:37.797 11:42:21 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:31:37.797 11:42:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:31:37.797 [2024-06-10 11:42:21.676243] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e6600 00:31:38.736 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:38.996 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:39.255 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:39.255 "name": "raid_bdev1", 00:31:39.255 "uuid": "06560917-bf7d-432f-9a65-6a04c2918f87", 00:31:39.255 "strip_size_kb": 64, 00:31:39.255 "state": "online", 00:31:39.255 "raid_level": "concat", 00:31:39.255 "superblock": true, 00:31:39.255 "num_base_bdevs": 3, 00:31:39.255 "num_base_bdevs_discovered": 3, 00:31:39.255 "num_base_bdevs_operational": 3, 00:31:39.255 "base_bdevs_list": [ 00:31:39.255 { 00:31:39.255 "name": "BaseBdev1", 00:31:39.255 "uuid": "7cb3aa33-6d69-5145-9fb1-b1ecfe7440b4", 00:31:39.255 "is_configured": true, 00:31:39.255 "data_offset": 2048, 00:31:39.255 "data_size": 63488 00:31:39.255 }, 00:31:39.255 { 00:31:39.255 "name": "BaseBdev2", 00:31:39.255 "uuid": "4310680f-1669-54f7-9092-e55bd042c1ab", 00:31:39.255 "is_configured": true, 00:31:39.255 "data_offset": 2048, 00:31:39.255 "data_size": 63488 00:31:39.255 }, 00:31:39.255 { 00:31:39.255 "name": "BaseBdev3", 00:31:39.255 "uuid": "1f07369b-5b86-5273-8870-6f94e098b23d", 00:31:39.255 "is_configured": true, 00:31:39.255 "data_offset": 2048, 00:31:39.255 "data_size": 63488 00:31:39.255 } 00:31:39.255 ] 00:31:39.255 }' 00:31:39.255 11:42:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:39.255 11:42:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:39.514 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:39.773 [2024-06-10 
11:42:23.576273] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:39.773 [2024-06-10 11:42:23.576308] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:39.773 [2024-06-10 11:42:23.578294] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:39.773 [2024-06-10 11:42:23.578317] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:39.773 [2024-06-10 11:42:23.578338] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:39.773 [2024-06-10 11:42:23.578345] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e40b0 name raid_bdev1, state offline 00:31:39.773 0 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 166845 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 166845 ']' 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 166845 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 166845 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 166845' 00:31:39.773 killing process with pid 166845 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 166845 00:31:39.773 [2024-06-10 11:42:23.644225] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:39.773 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 166845 00:31:39.773 [2024-06-10 11:42:23.664281] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.8tjtpy0aaG 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:31:40.033 00:31:40.033 real 0m5.535s 00:31:40.033 user 0m8.429s 00:31:40.033 sys 0m0.998s 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:40.033 11:42:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:40.033 ************************************ 00:31:40.033 END TEST raid_read_error_test 00:31:40.033 ************************************ 00:31:40.033 11:42:23 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:31:40.033 11:42:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:31:40.033 11:42:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:40.033 11:42:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:40.033 ************************************ 00:31:40.033 START TEST raid_write_error_test 
00:31:40.033 ************************************ 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 write 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:40.033 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:31:40.034 11:42:23 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FeGVEXV4tv 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=167653 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 167653 /var/tmp/spdk-raid.sock 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 167653 ']' 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:31:40.034 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:40.034 11:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:40.293 [2024-06-10 11:42:23.998409] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:31:40.293 [2024-06-10 11:42:23.998459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid167653 ] 00:31:40.293 [2024-06-10 11:42:24.084642] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:40.293 [2024-06-10 11:42:24.171819] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:40.293 [2024-06-10 11:42:24.229871] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:40.293 [2024-06-10 11:42:24.229899] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:40.943 11:42:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:40.944 11:42:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:31:40.944 11:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:31:40.944 11:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:31:41.203 BaseBdev1_malloc 00:31:41.203 11:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:31:41.203 true 00:31:41.203 11:42:25 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:31:41.463 [2024-06-10 11:42:25.278405] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:31:41.463 [2024-06-10 11:42:25.278443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:41.463 [2024-06-10 11:42:25.278472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fb9b10 00:31:41.463 [2024-06-10 11:42:25.278481] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:41.463 [2024-06-10 11:42:25.279878] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:41.463 [2024-06-10 11:42:25.279901] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:41.463 BaseBdev1 00:31:41.463 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:31:41.463 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:31:41.722 BaseBdev2_malloc 00:31:41.722 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:31:41.722 true 00:31:41.722 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:31:41.981 [2024-06-10 11:42:25.776601] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:31:41.981 [2024-06-10 11:42:25.776636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:31:41.981 [2024-06-10 11:42:25.776649] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fbe280 00:31:41.981 [2024-06-10 11:42:25.776674] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:41.981 [2024-06-10 11:42:25.777842] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:41.981 [2024-06-10 11:42:25.777863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:31:41.981 BaseBdev2 00:31:41.981 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:31:41.981 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:31:42.240 BaseBdev3_malloc 00:31:42.241 11:42:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:31:42.241 true 00:31:42.241 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:31:42.500 [2024-06-10 11:42:26.274641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:31:42.500 [2024-06-10 11:42:26.274674] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:42.500 [2024-06-10 11:42:26.274689] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fc0ab0 00:31:42.500 [2024-06-10 11:42:26.274698] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:42.500 [2024-06-10 11:42:26.275787] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:42.500 [2024-06-10 11:42:26.275808] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:31:42.500 BaseBdev3 00:31:42.500 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:31:42.500 [2024-06-10 11:42:26.443105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:42.500 [2024-06-10 11:42:26.444046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:42.500 [2024-06-10 11:42:26.444095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:42.500 [2024-06-10 11:42:26.444246] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fbf0b0 00:31:42.500 [2024-06-10 11:42:26.444254] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:31:42.500 [2024-06-10 11:42:26.444390] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fbb520 00:31:42.500 [2024-06-10 11:42:26.444497] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fbf0b0 00:31:42.500 [2024-06-10 11:42:26.444504] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fbf0b0 00:31:42.500 [2024-06-10 11:42:26.444575] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:42.759 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:31:42.759 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:42.759 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:42.759 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:42.759 11:42:26 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:42.760 "name": "raid_bdev1", 00:31:42.760 "uuid": "f2cfbd51-43aa-4557-a03e-ac6e864bfea2", 00:31:42.760 "strip_size_kb": 64, 00:31:42.760 "state": "online", 00:31:42.760 "raid_level": "concat", 00:31:42.760 "superblock": true, 00:31:42.760 "num_base_bdevs": 3, 00:31:42.760 "num_base_bdevs_discovered": 3, 00:31:42.760 "num_base_bdevs_operational": 3, 00:31:42.760 "base_bdevs_list": [ 00:31:42.760 { 00:31:42.760 "name": "BaseBdev1", 00:31:42.760 "uuid": "78f8a016-debd-5437-96a0-fda4a90fd891", 00:31:42.760 "is_configured": true, 00:31:42.760 "data_offset": 2048, 00:31:42.760 "data_size": 63488 00:31:42.760 }, 00:31:42.760 { 00:31:42.760 "name": "BaseBdev2", 00:31:42.760 "uuid": "83a49357-e304-5586-9609-027679f1af1d", 00:31:42.760 "is_configured": true, 00:31:42.760 "data_offset": 2048, 00:31:42.760 "data_size": 63488 00:31:42.760 }, 00:31:42.760 { 00:31:42.760 "name": "BaseBdev3", 00:31:42.760 "uuid": 
"f9073f61-3080-53d5-b2f4-b55b2c0bae50", 00:31:42.760 "is_configured": true, 00:31:42.760 "data_offset": 2048, 00:31:42.760 "data_size": 63488 00:31:42.760 } 00:31:42.760 ] 00:31:42.760 }' 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:42.760 11:42:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:43.386 11:42:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:31:43.386 11:42:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:31:43.386 [2024-06-10 11:42:27.233323] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fc1600 00:31:44.324 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:44.583 11:42:28 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:44.583 "name": "raid_bdev1", 00:31:44.583 "uuid": "f2cfbd51-43aa-4557-a03e-ac6e864bfea2", 00:31:44.583 "strip_size_kb": 64, 00:31:44.583 "state": "online", 00:31:44.583 "raid_level": "concat", 00:31:44.583 "superblock": true, 00:31:44.583 "num_base_bdevs": 3, 00:31:44.583 "num_base_bdevs_discovered": 3, 00:31:44.583 "num_base_bdevs_operational": 3, 00:31:44.583 "base_bdevs_list": [ 00:31:44.583 { 00:31:44.583 "name": "BaseBdev1", 00:31:44.583 "uuid": "78f8a016-debd-5437-96a0-fda4a90fd891", 00:31:44.583 "is_configured": true, 00:31:44.583 "data_offset": 2048, 00:31:44.583 "data_size": 63488 00:31:44.583 }, 00:31:44.583 { 00:31:44.583 "name": "BaseBdev2", 00:31:44.583 "uuid": "83a49357-e304-5586-9609-027679f1af1d", 00:31:44.583 "is_configured": true, 00:31:44.583 "data_offset": 2048, 00:31:44.583 "data_size": 63488 00:31:44.583 }, 00:31:44.583 { 00:31:44.583 "name": "BaseBdev3", 00:31:44.583 "uuid": "f9073f61-3080-53d5-b2f4-b55b2c0bae50", 00:31:44.583 "is_configured": true, 00:31:44.583 "data_offset": 2048, 00:31:44.583 "data_size": 
63488 00:31:44.583 } 00:31:44.583 ] 00:31:44.583 }' 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:44.583 11:42:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:45.151 11:42:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:45.411 [2024-06-10 11:42:29.153903] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:45.411 [2024-06-10 11:42:29.153932] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:45.411 [2024-06-10 11:42:29.156031] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:45.411 [2024-06-10 11:42:29.156054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:45.411 [2024-06-10 11:42:29.156077] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:45.411 [2024-06-10 11:42:29.156084] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fbf0b0 name raid_bdev1, state offline 00:31:45.411 0 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 167653 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 167653 ']' 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 167653 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 167653 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # 
process_name=reactor_0 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 167653' 00:31:45.411 killing process with pid 167653 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 167653 00:31:45.411 [2024-06-10 11:42:29.223767] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:45.411 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 167653 00:31:45.411 [2024-06-10 11:42:29.244305] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FeGVEXV4tv 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:31:45.670 00:31:45.670 real 0m5.527s 00:31:45.670 user 0m8.421s 00:31:45.670 sys 0m0.973s 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:45.670 11:42:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:31:45.670 ************************************ 00:31:45.670 END TEST raid_write_error_test 00:31:45.670 ************************************ 
00:31:45.670 11:42:29 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:31:45.670 11:42:29 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:31:45.670 11:42:29 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:31:45.670 11:42:29 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:45.670 11:42:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:45.670 ************************************ 00:31:45.670 START TEST raid_state_function_test 00:31:45.670 ************************************ 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 false 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # (( i <= num_base_bdevs )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=168460 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 168460' 00:31:45.670 Process raid pid: 168460 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 168460 /var/tmp/spdk-raid.sock 00:31:45.670 11:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 168460 ']' 00:31:45.671 11:42:29 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:45.671 11:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:45.671 11:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:45.671 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:45.671 11:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:31:45.671 11:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:45.671 11:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:45.671 [2024-06-10 11:42:29.600311] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:31:45.671 [2024-06-10 11:42:29.600362] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:45.930 [2024-06-10 11:42:29.688658] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:45.930 [2024-06-10 11:42:29.775856] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:45.930 [2024-06-10 11:42:29.835126] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:45.930 [2024-06-10 11:42:29.835152] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:46.497 11:42:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:46.497 11:42:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:31:46.497 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:46.756 [2024-06-10 11:42:30.557789] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:46.756 [2024-06-10 11:42:30.557827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:46.756 [2024-06-10 11:42:30.557836] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:46.756 [2024-06-10 11:42:30.557860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:46.756 [2024-06-10 11:42:30.557870] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:46.756 [2024-06-10 11:42:30.557878] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:46.756 11:42:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:46.756 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:46.757 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:47.015 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:47.015 "name": "Existed_Raid", 00:31:47.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:47.015 "strip_size_kb": 0, 00:31:47.015 "state": "configuring", 00:31:47.015 "raid_level": "raid1", 00:31:47.015 "superblock": false, 00:31:47.015 "num_base_bdevs": 3, 00:31:47.015 "num_base_bdevs_discovered": 0, 00:31:47.015 "num_base_bdevs_operational": 3, 00:31:47.015 "base_bdevs_list": [ 00:31:47.015 { 00:31:47.015 
"name": "BaseBdev1", 00:31:47.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:47.015 "is_configured": false, 00:31:47.015 "data_offset": 0, 00:31:47.015 "data_size": 0 00:31:47.015 }, 00:31:47.015 { 00:31:47.015 "name": "BaseBdev2", 00:31:47.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:47.015 "is_configured": false, 00:31:47.015 "data_offset": 0, 00:31:47.015 "data_size": 0 00:31:47.015 }, 00:31:47.015 { 00:31:47.015 "name": "BaseBdev3", 00:31:47.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:47.015 "is_configured": false, 00:31:47.015 "data_offset": 0, 00:31:47.015 "data_size": 0 00:31:47.015 } 00:31:47.015 ] 00:31:47.015 }' 00:31:47.015 11:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:47.015 11:42:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:47.583 11:42:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:47.583 [2024-06-10 11:42:31.383852] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:47.583 [2024-06-10 11:42:31.383876] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f1530 name Existed_Raid, state configuring 00:31:47.583 11:42:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:47.843 [2024-06-10 11:42:31.564335] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:47.843 [2024-06-10 11:42:31.564355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:47.843 [2024-06-10 11:42:31.564361] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:31:47.843 [2024-06-10 11:42:31.564368] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:47.843 [2024-06-10 11:42:31.564389] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:47.843 [2024-06-10 11:42:31.564396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:31:47.843 [2024-06-10 11:42:31.757386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:47.843 BaseBdev1 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:47.843 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:48.102 11:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:48.360 [ 00:31:48.360 { 00:31:48.360 "name": "BaseBdev1", 00:31:48.360 "aliases": [ 00:31:48.360 "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1" 
00:31:48.360 ], 00:31:48.360 "product_name": "Malloc disk", 00:31:48.360 "block_size": 512, 00:31:48.360 "num_blocks": 65536, 00:31:48.360 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:48.360 "assigned_rate_limits": { 00:31:48.360 "rw_ios_per_sec": 0, 00:31:48.360 "rw_mbytes_per_sec": 0, 00:31:48.360 "r_mbytes_per_sec": 0, 00:31:48.360 "w_mbytes_per_sec": 0 00:31:48.360 }, 00:31:48.360 "claimed": true, 00:31:48.360 "claim_type": "exclusive_write", 00:31:48.360 "zoned": false, 00:31:48.360 "supported_io_types": { 00:31:48.360 "read": true, 00:31:48.360 "write": true, 00:31:48.360 "unmap": true, 00:31:48.360 "write_zeroes": true, 00:31:48.360 "flush": true, 00:31:48.360 "reset": true, 00:31:48.360 "compare": false, 00:31:48.360 "compare_and_write": false, 00:31:48.360 "abort": true, 00:31:48.360 "nvme_admin": false, 00:31:48.360 "nvme_io": false 00:31:48.360 }, 00:31:48.360 "memory_domains": [ 00:31:48.360 { 00:31:48.360 "dma_device_id": "system", 00:31:48.360 "dma_device_type": 1 00:31:48.360 }, 00:31:48.360 { 00:31:48.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:48.360 "dma_device_type": 2 00:31:48.360 } 00:31:48.360 ], 00:31:48.360 "driver_specific": {} 00:31:48.360 } 00:31:48.360 ] 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:48.360 "name": "Existed_Raid", 00:31:48.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:48.360 "strip_size_kb": 0, 00:31:48.360 "state": "configuring", 00:31:48.360 "raid_level": "raid1", 00:31:48.360 "superblock": false, 00:31:48.360 "num_base_bdevs": 3, 00:31:48.360 "num_base_bdevs_discovered": 1, 00:31:48.360 "num_base_bdevs_operational": 3, 00:31:48.360 "base_bdevs_list": [ 00:31:48.360 { 00:31:48.360 "name": "BaseBdev1", 00:31:48.360 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:48.360 "is_configured": true, 00:31:48.360 "data_offset": 0, 00:31:48.360 "data_size": 65536 00:31:48.360 }, 00:31:48.360 { 00:31:48.360 "name": "BaseBdev2", 00:31:48.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:48.360 "is_configured": false, 00:31:48.360 "data_offset": 0, 00:31:48.360 "data_size": 0 00:31:48.360 }, 00:31:48.360 { 00:31:48.360 "name": "BaseBdev3", 00:31:48.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:48.360 "is_configured": false, 00:31:48.360 "data_offset": 0, 00:31:48.360 "data_size": 0 00:31:48.360 } 
00:31:48.360 ] 00:31:48.360 }' 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:48.360 11:42:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:48.927 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:49.186 [2024-06-10 11:42:32.904352] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:49.186 [2024-06-10 11:42:32.904381] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f0e00 name Existed_Raid, state configuring 00:31:49.186 11:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:49.186 [2024-06-10 11:42:33.088848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:49.186 [2024-06-10 11:42:33.089993] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:49.186 [2024-06-10 11:42:33.090018] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:49.186 [2024-06-10 11:42:33.090024] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:49.186 [2024-06-10 11:42:33.090032] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:31:49.186 11:42:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:49.186 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:49.445 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:49.445 "name": "Existed_Raid", 00:31:49.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:49.445 "strip_size_kb": 0, 00:31:49.445 "state": "configuring", 00:31:49.445 "raid_level": "raid1", 00:31:49.445 "superblock": false, 00:31:49.445 "num_base_bdevs": 3, 00:31:49.445 "num_base_bdevs_discovered": 1, 00:31:49.445 "num_base_bdevs_operational": 3, 00:31:49.445 "base_bdevs_list": [ 00:31:49.445 { 00:31:49.445 "name": "BaseBdev1", 00:31:49.445 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:49.445 "is_configured": true, 00:31:49.445 "data_offset": 
0, 00:31:49.445 "data_size": 65536 00:31:49.445 }, 00:31:49.445 { 00:31:49.445 "name": "BaseBdev2", 00:31:49.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:49.445 "is_configured": false, 00:31:49.445 "data_offset": 0, 00:31:49.445 "data_size": 0 00:31:49.445 }, 00:31:49.445 { 00:31:49.445 "name": "BaseBdev3", 00:31:49.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:49.445 "is_configured": false, 00:31:49.445 "data_offset": 0, 00:31:49.445 "data_size": 0 00:31:49.445 } 00:31:49.445 ] 00:31:49.445 }' 00:31:49.445 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:49.445 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:31:50.012 [2024-06-10 11:42:33.905770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:50.012 BaseBdev2 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:50.012 11:42:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:50.269 11:42:34 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:50.526 [ 00:31:50.526 { 00:31:50.526 "name": "BaseBdev2", 00:31:50.526 "aliases": [ 00:31:50.526 "7df3031f-76a1-40f3-8659-8db310c4ec16" 00:31:50.526 ], 00:31:50.526 "product_name": "Malloc disk", 00:31:50.526 "block_size": 512, 00:31:50.526 "num_blocks": 65536, 00:31:50.526 "uuid": "7df3031f-76a1-40f3-8659-8db310c4ec16", 00:31:50.526 "assigned_rate_limits": { 00:31:50.526 "rw_ios_per_sec": 0, 00:31:50.526 "rw_mbytes_per_sec": 0, 00:31:50.526 "r_mbytes_per_sec": 0, 00:31:50.526 "w_mbytes_per_sec": 0 00:31:50.526 }, 00:31:50.526 "claimed": true, 00:31:50.526 "claim_type": "exclusive_write", 00:31:50.526 "zoned": false, 00:31:50.526 "supported_io_types": { 00:31:50.526 "read": true, 00:31:50.526 "write": true, 00:31:50.526 "unmap": true, 00:31:50.526 "write_zeroes": true, 00:31:50.526 "flush": true, 00:31:50.526 "reset": true, 00:31:50.526 "compare": false, 00:31:50.526 "compare_and_write": false, 00:31:50.526 "abort": true, 00:31:50.526 "nvme_admin": false, 00:31:50.526 "nvme_io": false 00:31:50.526 }, 00:31:50.526 "memory_domains": [ 00:31:50.526 { 00:31:50.526 "dma_device_id": "system", 00:31:50.526 "dma_device_type": 1 00:31:50.526 }, 00:31:50.526 { 00:31:50.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:50.526 "dma_device_type": 2 00:31:50.526 } 00:31:50.526 ], 00:31:50.526 "driver_specific": {} 00:31:50.526 } 00:31:50.526 ] 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring raid1 0 3 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:50.526 "name": "Existed_Raid", 00:31:50.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:50.526 "strip_size_kb": 0, 00:31:50.526 "state": "configuring", 00:31:50.526 "raid_level": "raid1", 00:31:50.526 "superblock": false, 00:31:50.526 "num_base_bdevs": 3, 00:31:50.526 "num_base_bdevs_discovered": 2, 00:31:50.526 "num_base_bdevs_operational": 3, 00:31:50.526 "base_bdevs_list": [ 00:31:50.526 { 00:31:50.526 "name": "BaseBdev1", 00:31:50.526 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:50.526 
"is_configured": true, 00:31:50.526 "data_offset": 0, 00:31:50.526 "data_size": 65536 00:31:50.526 }, 00:31:50.526 { 00:31:50.526 "name": "BaseBdev2", 00:31:50.526 "uuid": "7df3031f-76a1-40f3-8659-8db310c4ec16", 00:31:50.526 "is_configured": true, 00:31:50.526 "data_offset": 0, 00:31:50.526 "data_size": 65536 00:31:50.526 }, 00:31:50.526 { 00:31:50.526 "name": "BaseBdev3", 00:31:50.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:50.526 "is_configured": false, 00:31:50.526 "data_offset": 0, 00:31:50.526 "data_size": 0 00:31:50.526 } 00:31:50.526 ] 00:31:50.526 }' 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:50.526 11:42:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:51.093 11:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:31:51.351 [2024-06-10 11:42:35.103673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:51.351 [2024-06-10 11:42:35.103702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25f1cf0 00:31:51.351 [2024-06-10 11:42:35.103708] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:31:51.351 [2024-06-10 11:42:35.103841] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e9850 00:31:51.351 [2024-06-10 11:42:35.103933] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25f1cf0 00:31:51.351 [2024-06-10 11:42:35.103940] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25f1cf0 00:31:51.351 [2024-06-10 11:42:35.104063] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:51.351 BaseBdev3 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # 
waitforbdev BaseBdev3 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:51.351 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:31:51.609 [ 00:31:51.609 { 00:31:51.609 "name": "BaseBdev3", 00:31:51.609 "aliases": [ 00:31:51.609 "7785865c-9b45-4b3b-8f87-5c8355830e95" 00:31:51.609 ], 00:31:51.609 "product_name": "Malloc disk", 00:31:51.609 "block_size": 512, 00:31:51.609 "num_blocks": 65536, 00:31:51.609 "uuid": "7785865c-9b45-4b3b-8f87-5c8355830e95", 00:31:51.609 "assigned_rate_limits": { 00:31:51.609 "rw_ios_per_sec": 0, 00:31:51.609 "rw_mbytes_per_sec": 0, 00:31:51.609 "r_mbytes_per_sec": 0, 00:31:51.609 "w_mbytes_per_sec": 0 00:31:51.609 }, 00:31:51.609 "claimed": true, 00:31:51.609 "claim_type": "exclusive_write", 00:31:51.609 "zoned": false, 00:31:51.609 "supported_io_types": { 00:31:51.609 "read": true, 00:31:51.609 "write": true, 00:31:51.609 "unmap": true, 00:31:51.609 "write_zeroes": true, 00:31:51.609 "flush": true, 00:31:51.609 "reset": true, 00:31:51.609 "compare": false, 00:31:51.609 "compare_and_write": false, 00:31:51.609 "abort": true, 00:31:51.609 "nvme_admin": false, 00:31:51.609 
"nvme_io": false 00:31:51.609 }, 00:31:51.609 "memory_domains": [ 00:31:51.609 { 00:31:51.609 "dma_device_id": "system", 00:31:51.609 "dma_device_type": 1 00:31:51.609 }, 00:31:51.609 { 00:31:51.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:51.609 "dma_device_type": 2 00:31:51.609 } 00:31:51.609 ], 00:31:51.609 "driver_specific": {} 00:31:51.609 } 00:31:51.609 ] 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:51.609 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:51.609 11:42:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:51.867 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:51.867 "name": "Existed_Raid", 00:31:51.867 "uuid": "6bd40d08-ce80-436c-af41-4b3ba22654d3", 00:31:51.867 "strip_size_kb": 0, 00:31:51.867 "state": "online", 00:31:51.867 "raid_level": "raid1", 00:31:51.867 "superblock": false, 00:31:51.867 "num_base_bdevs": 3, 00:31:51.867 "num_base_bdevs_discovered": 3, 00:31:51.867 "num_base_bdevs_operational": 3, 00:31:51.867 "base_bdevs_list": [ 00:31:51.867 { 00:31:51.867 "name": "BaseBdev1", 00:31:51.867 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:51.867 "is_configured": true, 00:31:51.867 "data_offset": 0, 00:31:51.867 "data_size": 65536 00:31:51.867 }, 00:31:51.867 { 00:31:51.867 "name": "BaseBdev2", 00:31:51.867 "uuid": "7df3031f-76a1-40f3-8659-8db310c4ec16", 00:31:51.867 "is_configured": true, 00:31:51.867 "data_offset": 0, 00:31:51.867 "data_size": 65536 00:31:51.867 }, 00:31:51.867 { 00:31:51.867 "name": "BaseBdev3", 00:31:51.867 "uuid": "7785865c-9b45-4b3b-8f87-5c8355830e95", 00:31:51.867 "is_configured": true, 00:31:51.867 "data_offset": 0, 00:31:51.867 "data_size": 65536 00:31:51.867 } 00:31:51.867 ] 00:31:51.867 }' 00:31:51.867 11:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:51.867 11:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:52.434 [2024-06-10 11:42:36.287072] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:52.434 "name": "Existed_Raid", 00:31:52.434 "aliases": [ 00:31:52.434 "6bd40d08-ce80-436c-af41-4b3ba22654d3" 00:31:52.434 ], 00:31:52.434 "product_name": "Raid Volume", 00:31:52.434 "block_size": 512, 00:31:52.434 "num_blocks": 65536, 00:31:52.434 "uuid": "6bd40d08-ce80-436c-af41-4b3ba22654d3", 00:31:52.434 "assigned_rate_limits": { 00:31:52.434 "rw_ios_per_sec": 0, 00:31:52.434 "rw_mbytes_per_sec": 0, 00:31:52.434 "r_mbytes_per_sec": 0, 00:31:52.434 "w_mbytes_per_sec": 0 00:31:52.434 }, 00:31:52.434 "claimed": false, 00:31:52.434 "zoned": false, 00:31:52.434 "supported_io_types": { 00:31:52.434 "read": true, 00:31:52.434 "write": true, 00:31:52.434 "unmap": false, 00:31:52.434 "write_zeroes": true, 00:31:52.434 "flush": false, 00:31:52.434 "reset": true, 00:31:52.434 "compare": false, 00:31:52.434 "compare_and_write": false, 00:31:52.434 "abort": false, 00:31:52.434 "nvme_admin": false, 00:31:52.434 "nvme_io": false 00:31:52.434 }, 00:31:52.434 "memory_domains": [ 00:31:52.434 { 00:31:52.434 "dma_device_id": "system", 00:31:52.434 "dma_device_type": 1 00:31:52.434 }, 00:31:52.434 { 00:31:52.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:52.434 "dma_device_type": 2 00:31:52.434 }, 
00:31:52.434 { 00:31:52.434 "dma_device_id": "system", 00:31:52.434 "dma_device_type": 1 00:31:52.434 }, 00:31:52.434 { 00:31:52.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:52.434 "dma_device_type": 2 00:31:52.434 }, 00:31:52.434 { 00:31:52.434 "dma_device_id": "system", 00:31:52.434 "dma_device_type": 1 00:31:52.434 }, 00:31:52.434 { 00:31:52.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:52.434 "dma_device_type": 2 00:31:52.434 } 00:31:52.434 ], 00:31:52.434 "driver_specific": { 00:31:52.434 "raid": { 00:31:52.434 "uuid": "6bd40d08-ce80-436c-af41-4b3ba22654d3", 00:31:52.434 "strip_size_kb": 0, 00:31:52.434 "state": "online", 00:31:52.434 "raid_level": "raid1", 00:31:52.434 "superblock": false, 00:31:52.434 "num_base_bdevs": 3, 00:31:52.434 "num_base_bdevs_discovered": 3, 00:31:52.434 "num_base_bdevs_operational": 3, 00:31:52.434 "base_bdevs_list": [ 00:31:52.434 { 00:31:52.434 "name": "BaseBdev1", 00:31:52.434 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:52.434 "is_configured": true, 00:31:52.434 "data_offset": 0, 00:31:52.434 "data_size": 65536 00:31:52.434 }, 00:31:52.434 { 00:31:52.434 "name": "BaseBdev2", 00:31:52.434 "uuid": "7df3031f-76a1-40f3-8659-8db310c4ec16", 00:31:52.434 "is_configured": true, 00:31:52.434 "data_offset": 0, 00:31:52.434 "data_size": 65536 00:31:52.434 }, 00:31:52.434 { 00:31:52.434 "name": "BaseBdev3", 00:31:52.434 "uuid": "7785865c-9b45-4b3b-8f87-5c8355830e95", 00:31:52.434 "is_configured": true, 00:31:52.434 "data_offset": 0, 00:31:52.434 "data_size": 65536 00:31:52.434 } 00:31:52.434 ] 00:31:52.434 } 00:31:52.434 } 00:31:52.434 }' 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:31:52.434 BaseBdev2 00:31:52.434 BaseBdev3' 00:31:52.434 11:42:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:31:52.434 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:52.692 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:52.692 "name": "BaseBdev1", 00:31:52.692 "aliases": [ 00:31:52.692 "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1" 00:31:52.692 ], 00:31:52.692 "product_name": "Malloc disk", 00:31:52.692 "block_size": 512, 00:31:52.692 "num_blocks": 65536, 00:31:52.692 "uuid": "a0797cc8-8a3e-4bce-85a1-aad3d6593cd1", 00:31:52.692 "assigned_rate_limits": { 00:31:52.692 "rw_ios_per_sec": 0, 00:31:52.692 "rw_mbytes_per_sec": 0, 00:31:52.692 "r_mbytes_per_sec": 0, 00:31:52.692 "w_mbytes_per_sec": 0 00:31:52.692 }, 00:31:52.692 "claimed": true, 00:31:52.692 "claim_type": "exclusive_write", 00:31:52.693 "zoned": false, 00:31:52.693 "supported_io_types": { 00:31:52.693 "read": true, 00:31:52.693 "write": true, 00:31:52.693 "unmap": true, 00:31:52.693 "write_zeroes": true, 00:31:52.693 "flush": true, 00:31:52.693 "reset": true, 00:31:52.693 "compare": false, 00:31:52.693 "compare_and_write": false, 00:31:52.693 "abort": true, 00:31:52.693 "nvme_admin": false, 00:31:52.693 "nvme_io": false 00:31:52.693 }, 00:31:52.693 "memory_domains": [ 00:31:52.693 { 00:31:52.693 "dma_device_id": "system", 00:31:52.693 "dma_device_type": 1 00:31:52.693 }, 00:31:52.693 { 00:31:52.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:52.693 "dma_device_type": 2 00:31:52.693 } 00:31:52.693 ], 00:31:52.693 "driver_specific": {} 00:31:52.693 }' 00:31:52.693 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:52.693 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:31:52.693 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:52.693 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:52.951 11:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:53.208 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:53.209 "name": "BaseBdev2", 00:31:53.209 "aliases": [ 00:31:53.209 "7df3031f-76a1-40f3-8659-8db310c4ec16" 00:31:53.209 ], 00:31:53.209 "product_name": "Malloc disk", 00:31:53.209 "block_size": 512, 00:31:53.209 "num_blocks": 65536, 00:31:53.209 "uuid": "7df3031f-76a1-40f3-8659-8db310c4ec16", 00:31:53.209 "assigned_rate_limits": { 00:31:53.209 "rw_ios_per_sec": 0, 00:31:53.209 "rw_mbytes_per_sec": 0, 00:31:53.209 
"r_mbytes_per_sec": 0, 00:31:53.209 "w_mbytes_per_sec": 0 00:31:53.209 }, 00:31:53.209 "claimed": true, 00:31:53.209 "claim_type": "exclusive_write", 00:31:53.209 "zoned": false, 00:31:53.209 "supported_io_types": { 00:31:53.209 "read": true, 00:31:53.209 "write": true, 00:31:53.209 "unmap": true, 00:31:53.209 "write_zeroes": true, 00:31:53.209 "flush": true, 00:31:53.209 "reset": true, 00:31:53.209 "compare": false, 00:31:53.209 "compare_and_write": false, 00:31:53.209 "abort": true, 00:31:53.209 "nvme_admin": false, 00:31:53.209 "nvme_io": false 00:31:53.209 }, 00:31:53.209 "memory_domains": [ 00:31:53.209 { 00:31:53.209 "dma_device_id": "system", 00:31:53.209 "dma_device_type": 1 00:31:53.209 }, 00:31:53.209 { 00:31:53.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:53.209 "dma_device_type": 2 00:31:53.209 } 00:31:53.209 ], 00:31:53.209 "driver_specific": {} 00:31:53.209 }' 00:31:53.209 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:53.209 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:53.209 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:53.209 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:53.209 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:53.467 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:53.725 "name": "BaseBdev3", 00:31:53.725 "aliases": [ 00:31:53.725 "7785865c-9b45-4b3b-8f87-5c8355830e95" 00:31:53.725 ], 00:31:53.725 "product_name": "Malloc disk", 00:31:53.725 "block_size": 512, 00:31:53.725 "num_blocks": 65536, 00:31:53.725 "uuid": "7785865c-9b45-4b3b-8f87-5c8355830e95", 00:31:53.725 "assigned_rate_limits": { 00:31:53.725 "rw_ios_per_sec": 0, 00:31:53.725 "rw_mbytes_per_sec": 0, 00:31:53.725 "r_mbytes_per_sec": 0, 00:31:53.725 "w_mbytes_per_sec": 0 00:31:53.725 }, 00:31:53.725 "claimed": true, 00:31:53.725 "claim_type": "exclusive_write", 00:31:53.725 "zoned": false, 00:31:53.725 "supported_io_types": { 00:31:53.725 "read": true, 00:31:53.725 "write": true, 00:31:53.725 "unmap": true, 00:31:53.725 "write_zeroes": true, 00:31:53.725 "flush": true, 00:31:53.725 "reset": true, 00:31:53.725 "compare": false, 00:31:53.725 "compare_and_write": false, 00:31:53.725 "abort": true, 00:31:53.725 "nvme_admin": false, 00:31:53.725 "nvme_io": false 00:31:53.725 }, 00:31:53.725 "memory_domains": [ 00:31:53.725 { 00:31:53.725 "dma_device_id": "system", 00:31:53.725 "dma_device_type": 1 00:31:53.725 }, 00:31:53.725 { 00:31:53.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:53.725 "dma_device_type": 2 00:31:53.725 } 00:31:53.725 ], 00:31:53.725 "driver_specific": {} 00:31:53.725 }' 00:31:53.725 
11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:53.725 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:53.983 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:53.983 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:53.983 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:53.983 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:53.983 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:53.983 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:54.241 [2024-06-10 11:42:37.963257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:31:54.241 11:42:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:54.241 11:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:54.241 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:54.241 "name": "Existed_Raid", 00:31:54.241 "uuid": "6bd40d08-ce80-436c-af41-4b3ba22654d3", 00:31:54.241 "strip_size_kb": 0, 00:31:54.241 "state": "online", 00:31:54.241 "raid_level": "raid1", 00:31:54.241 "superblock": false, 00:31:54.241 "num_base_bdevs": 3, 00:31:54.241 "num_base_bdevs_discovered": 2, 00:31:54.241 
"num_base_bdevs_operational": 2, 00:31:54.241 "base_bdevs_list": [ 00:31:54.241 { 00:31:54.241 "name": null, 00:31:54.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:54.241 "is_configured": false, 00:31:54.241 "data_offset": 0, 00:31:54.241 "data_size": 65536 00:31:54.241 }, 00:31:54.241 { 00:31:54.241 "name": "BaseBdev2", 00:31:54.241 "uuid": "7df3031f-76a1-40f3-8659-8db310c4ec16", 00:31:54.241 "is_configured": true, 00:31:54.241 "data_offset": 0, 00:31:54.241 "data_size": 65536 00:31:54.241 }, 00:31:54.241 { 00:31:54.241 "name": "BaseBdev3", 00:31:54.241 "uuid": "7785865c-9b45-4b3b-8f87-5c8355830e95", 00:31:54.241 "is_configured": true, 00:31:54.241 "data_offset": 0, 00:31:54.241 "data_size": 65536 00:31:54.241 } 00:31:54.241 ] 00:31:54.241 }' 00:31:54.241 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:54.241 11:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:54.806 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:31:54.806 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:54.806 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:54.806 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:55.064 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:55.064 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:55.065 11:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:31:55.323 [2024-06-10 11:42:39.014679] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:55.323 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:31:55.581 [2024-06-10 11:42:39.379498] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:31:55.581 [2024-06-10 11:42:39.379557] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:55.581 [2024-06-10 11:42:39.391530] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:55.581 [2024-06-10 11:42:39.391557] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:55.581 [2024-06-10 11:42:39.391565] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f1cf0 name Existed_Raid, state offline 00:31:55.581 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:55.581 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:55.581 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:55.581 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:31:55.839 BaseBdev2 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:55.839 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:56.097 11:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:56.356 [ 00:31:56.356 { 00:31:56.356 "name": "BaseBdev2", 00:31:56.356 "aliases": [ 00:31:56.356 "6566273b-ce24-4060-8cde-3ea06e48a9c3" 00:31:56.356 ], 00:31:56.356 "product_name": "Malloc disk", 00:31:56.356 "block_size": 512, 00:31:56.356 "num_blocks": 65536, 00:31:56.356 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:31:56.356 "assigned_rate_limits": { 00:31:56.356 "rw_ios_per_sec": 0, 00:31:56.356 "rw_mbytes_per_sec": 0, 00:31:56.356 "r_mbytes_per_sec": 0, 00:31:56.356 "w_mbytes_per_sec": 0 00:31:56.356 }, 00:31:56.356 "claimed": false, 00:31:56.356 "zoned": false, 00:31:56.356 "supported_io_types": { 00:31:56.356 "read": true, 00:31:56.356 "write": true, 00:31:56.356 "unmap": true, 00:31:56.356 "write_zeroes": true, 00:31:56.356 "flush": true, 00:31:56.356 "reset": true, 00:31:56.356 "compare": false, 00:31:56.356 "compare_and_write": false, 00:31:56.356 "abort": true, 00:31:56.356 "nvme_admin": false, 00:31:56.356 "nvme_io": false 00:31:56.356 }, 00:31:56.356 "memory_domains": [ 00:31:56.356 { 00:31:56.356 "dma_device_id": "system", 00:31:56.356 "dma_device_type": 1 00:31:56.356 }, 00:31:56.356 { 00:31:56.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:56.356 "dma_device_type": 2 00:31:56.356 } 00:31:56.356 ], 00:31:56.356 "driver_specific": {} 00:31:56.356 } 00:31:56.356 ] 00:31:56.356 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:56.356 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:56.356 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:56.356 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:31:56.356 BaseBdev3 
00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:56.614 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:31:56.871 [ 00:31:56.871 { 00:31:56.871 "name": "BaseBdev3", 00:31:56.871 "aliases": [ 00:31:56.871 "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc" 00:31:56.871 ], 00:31:56.871 "product_name": "Malloc disk", 00:31:56.872 "block_size": 512, 00:31:56.872 "num_blocks": 65536, 00:31:56.872 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:31:56.872 "assigned_rate_limits": { 00:31:56.872 "rw_ios_per_sec": 0, 00:31:56.872 "rw_mbytes_per_sec": 0, 00:31:56.872 "r_mbytes_per_sec": 0, 00:31:56.872 "w_mbytes_per_sec": 0 00:31:56.872 }, 00:31:56.872 "claimed": false, 00:31:56.872 "zoned": false, 00:31:56.872 "supported_io_types": { 00:31:56.872 "read": true, 00:31:56.872 "write": true, 00:31:56.872 "unmap": true, 00:31:56.872 "write_zeroes": true, 00:31:56.872 "flush": true, 00:31:56.872 "reset": true, 00:31:56.872 "compare": false, 00:31:56.872 "compare_and_write": false, 00:31:56.872 "abort": true, 
00:31:56.872 "nvme_admin": false, 00:31:56.872 "nvme_io": false 00:31:56.872 }, 00:31:56.872 "memory_domains": [ 00:31:56.872 { 00:31:56.872 "dma_device_id": "system", 00:31:56.872 "dma_device_type": 1 00:31:56.872 }, 00:31:56.872 { 00:31:56.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:56.872 "dma_device_type": 2 00:31:56.872 } 00:31:56.872 ], 00:31:56.872 "driver_specific": {} 00:31:56.872 } 00:31:56.872 ] 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:31:56.872 [2024-06-10 11:42:40.796166] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:56.872 [2024-06-10 11:42:40.796200] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:56.872 [2024-06-10 11:42:40.796213] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:56.872 [2024-06-10 11:42:40.797231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:56.872 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:57.130 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:57.130 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:57.130 "name": "Existed_Raid", 00:31:57.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:57.130 "strip_size_kb": 0, 00:31:57.130 "state": "configuring", 00:31:57.130 "raid_level": "raid1", 00:31:57.130 "superblock": false, 00:31:57.130 "num_base_bdevs": 3, 00:31:57.130 "num_base_bdevs_discovered": 2, 00:31:57.130 "num_base_bdevs_operational": 3, 00:31:57.130 "base_bdevs_list": [ 00:31:57.130 { 00:31:57.130 "name": "BaseBdev1", 00:31:57.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:57.130 "is_configured": false, 00:31:57.130 "data_offset": 0, 00:31:57.130 "data_size": 0 00:31:57.130 }, 00:31:57.130 { 00:31:57.130 "name": "BaseBdev2", 00:31:57.130 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:31:57.130 "is_configured": true, 00:31:57.130 "data_offset": 0, 00:31:57.130 "data_size": 65536 00:31:57.130 }, 00:31:57.130 { 00:31:57.130 "name": "BaseBdev3", 
00:31:57.130 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:31:57.130 "is_configured": true, 00:31:57.130 "data_offset": 0, 00:31:57.130 "data_size": 65536 00:31:57.130 } 00:31:57.130 ] 00:31:57.130 }' 00:31:57.130 11:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:57.130 11:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:57.697 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:31:57.697 [2024-06-10 11:42:41.622283] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:57.697 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:31:57.956 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:57.957 "name": "Existed_Raid", 00:31:57.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:57.957 "strip_size_kb": 0, 00:31:57.957 "state": "configuring", 00:31:57.957 "raid_level": "raid1", 00:31:57.957 "superblock": false, 00:31:57.957 "num_base_bdevs": 3, 00:31:57.957 "num_base_bdevs_discovered": 1, 00:31:57.957 "num_base_bdevs_operational": 3, 00:31:57.957 "base_bdevs_list": [ 00:31:57.957 { 00:31:57.957 "name": "BaseBdev1", 00:31:57.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:57.957 "is_configured": false, 00:31:57.957 "data_offset": 0, 00:31:57.957 "data_size": 0 00:31:57.957 }, 00:31:57.957 { 00:31:57.957 "name": null, 00:31:57.957 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:31:57.957 "is_configured": false, 00:31:57.957 "data_offset": 0, 00:31:57.957 "data_size": 65536 00:31:57.957 }, 00:31:57.957 { 00:31:57.957 "name": "BaseBdev3", 00:31:57.957 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:31:57.957 "is_configured": true, 00:31:57.957 "data_offset": 0, 00:31:57.957 "data_size": 65536 00:31:57.957 } 00:31:57.957 ] 00:31:57.957 }' 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:57.957 11:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:58.524 11:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:58.524 11:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:31:58.782 11:42:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:31:58.782 [2024-06-10 11:42:42.659884] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:58.782 BaseBdev1 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:58.782 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:59.041 11:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:59.300 [ 00:31:59.300 { 00:31:59.300 "name": "BaseBdev1", 00:31:59.300 "aliases": [ 00:31:59.300 "3da126dc-058c-433f-9820-44697496a938" 00:31:59.300 ], 00:31:59.300 "product_name": "Malloc disk", 00:31:59.300 "block_size": 512, 00:31:59.300 "num_blocks": 65536, 00:31:59.300 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:31:59.300 "assigned_rate_limits": { 00:31:59.300 "rw_ios_per_sec": 0, 00:31:59.300 "rw_mbytes_per_sec": 0, 
00:31:59.300 "r_mbytes_per_sec": 0, 00:31:59.300 "w_mbytes_per_sec": 0 00:31:59.300 }, 00:31:59.300 "claimed": true, 00:31:59.300 "claim_type": "exclusive_write", 00:31:59.300 "zoned": false, 00:31:59.300 "supported_io_types": { 00:31:59.300 "read": true, 00:31:59.300 "write": true, 00:31:59.300 "unmap": true, 00:31:59.300 "write_zeroes": true, 00:31:59.300 "flush": true, 00:31:59.300 "reset": true, 00:31:59.300 "compare": false, 00:31:59.300 "compare_and_write": false, 00:31:59.300 "abort": true, 00:31:59.300 "nvme_admin": false, 00:31:59.300 "nvme_io": false 00:31:59.300 }, 00:31:59.300 "memory_domains": [ 00:31:59.300 { 00:31:59.300 "dma_device_id": "system", 00:31:59.300 "dma_device_type": 1 00:31:59.300 }, 00:31:59.300 { 00:31:59.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:59.300 "dma_device_type": 2 00:31:59.300 } 00:31:59.300 ], 00:31:59.300 "driver_specific": {} 00:31:59.300 } 00:31:59.300 ] 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:59.300 11:42:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:59.300 "name": "Existed_Raid", 00:31:59.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:59.300 "strip_size_kb": 0, 00:31:59.300 "state": "configuring", 00:31:59.300 "raid_level": "raid1", 00:31:59.300 "superblock": false, 00:31:59.300 "num_base_bdevs": 3, 00:31:59.300 "num_base_bdevs_discovered": 2, 00:31:59.300 "num_base_bdevs_operational": 3, 00:31:59.300 "base_bdevs_list": [ 00:31:59.300 { 00:31:59.300 "name": "BaseBdev1", 00:31:59.300 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:31:59.300 "is_configured": true, 00:31:59.300 "data_offset": 0, 00:31:59.300 "data_size": 65536 00:31:59.300 }, 00:31:59.300 { 00:31:59.300 "name": null, 00:31:59.300 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:31:59.300 "is_configured": false, 00:31:59.300 "data_offset": 0, 00:31:59.300 "data_size": 65536 00:31:59.300 }, 00:31:59.300 { 00:31:59.300 "name": "BaseBdev3", 00:31:59.300 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:31:59.300 "is_configured": true, 00:31:59.300 "data_offset": 0, 00:31:59.300 "data_size": 65536 00:31:59.300 } 00:31:59.300 ] 00:31:59.300 }' 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:59.300 11:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:59.868 11:42:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:59.868 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:32:00.126 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:32:00.126 11:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:32:00.126 [2024-06-10 11:42:43.991334] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:32:00.126 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:00.126 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:00.126 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:00.126 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:00.126 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:00.127 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:00.385 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:00.385 "name": "Existed_Raid", 00:32:00.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:00.385 "strip_size_kb": 0, 00:32:00.385 "state": "configuring", 00:32:00.385 "raid_level": "raid1", 00:32:00.385 "superblock": false, 00:32:00.385 "num_base_bdevs": 3, 00:32:00.385 "num_base_bdevs_discovered": 1, 00:32:00.385 "num_base_bdevs_operational": 3, 00:32:00.385 "base_bdevs_list": [ 00:32:00.385 { 00:32:00.385 "name": "BaseBdev1", 00:32:00.385 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:00.385 "is_configured": true, 00:32:00.385 "data_offset": 0, 00:32:00.385 "data_size": 65536 00:32:00.385 }, 00:32:00.385 { 00:32:00.385 "name": null, 00:32:00.385 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:00.385 "is_configured": false, 00:32:00.385 "data_offset": 0, 00:32:00.385 "data_size": 65536 00:32:00.385 }, 00:32:00.385 { 00:32:00.385 "name": null, 00:32:00.385 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:00.385 "is_configured": false, 00:32:00.385 "data_offset": 0, 00:32:00.385 "data_size": 65536 00:32:00.385 } 00:32:00.385 ] 00:32:00.385 }' 00:32:00.385 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:00.385 11:42:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:00.953 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:00.953 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:32:00.953 11:42:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:32:00.953 11:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:32:01.212 [2024-06-10 11:42:45.038085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:01.212 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:01.471 11:42:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:01.471 "name": "Existed_Raid", 00:32:01.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:01.471 "strip_size_kb": 0, 00:32:01.471 "state": "configuring", 00:32:01.471 "raid_level": "raid1", 00:32:01.471 "superblock": false, 00:32:01.471 "num_base_bdevs": 3, 00:32:01.471 "num_base_bdevs_discovered": 2, 00:32:01.471 "num_base_bdevs_operational": 3, 00:32:01.471 "base_bdevs_list": [ 00:32:01.471 { 00:32:01.471 "name": "BaseBdev1", 00:32:01.471 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:01.471 "is_configured": true, 00:32:01.471 "data_offset": 0, 00:32:01.471 "data_size": 65536 00:32:01.471 }, 00:32:01.471 { 00:32:01.471 "name": null, 00:32:01.471 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:01.471 "is_configured": false, 00:32:01.471 "data_offset": 0, 00:32:01.471 "data_size": 65536 00:32:01.471 }, 00:32:01.471 { 00:32:01.471 "name": "BaseBdev3", 00:32:01.471 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:01.471 "is_configured": true, 00:32:01.471 "data_offset": 0, 00:32:01.471 "data_size": 65536 00:32:01.471 } 00:32:01.471 ] 00:32:01.471 }' 00:32:01.471 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:01.471 11:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:02.038 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:02.038 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:32:02.038 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:32:02.038 11:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:32:02.296 [2024-06-10 11:42:46.008611] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:02.296 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:02.296 "name": "Existed_Raid", 00:32:02.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:02.296 "strip_size_kb": 0, 00:32:02.296 "state": "configuring", 00:32:02.296 "raid_level": "raid1", 00:32:02.296 "superblock": false, 00:32:02.296 "num_base_bdevs": 3, 00:32:02.296 
"num_base_bdevs_discovered": 1, 00:32:02.296 "num_base_bdevs_operational": 3, 00:32:02.296 "base_bdevs_list": [ 00:32:02.296 { 00:32:02.296 "name": null, 00:32:02.296 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:02.296 "is_configured": false, 00:32:02.296 "data_offset": 0, 00:32:02.296 "data_size": 65536 00:32:02.296 }, 00:32:02.296 { 00:32:02.296 "name": null, 00:32:02.296 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:02.296 "is_configured": false, 00:32:02.297 "data_offset": 0, 00:32:02.297 "data_size": 65536 00:32:02.297 }, 00:32:02.297 { 00:32:02.297 "name": "BaseBdev3", 00:32:02.297 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:02.297 "is_configured": true, 00:32:02.297 "data_offset": 0, 00:32:02.297 "data_size": 65536 00:32:02.297 } 00:32:02.297 ] 00:32:02.297 }' 00:32:02.297 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:02.297 11:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:02.863 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:02.863 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:32:03.122 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:32:03.122 11:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:32:03.122 [2024-06-10 11:42:47.031050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:03.122 11:42:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:03.122 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:03.381 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:03.381 "name": "Existed_Raid", 00:32:03.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:03.381 "strip_size_kb": 0, 00:32:03.381 "state": "configuring", 00:32:03.381 "raid_level": "raid1", 00:32:03.381 "superblock": false, 00:32:03.381 "num_base_bdevs": 3, 00:32:03.381 "num_base_bdevs_discovered": 2, 00:32:03.381 "num_base_bdevs_operational": 3, 00:32:03.381 "base_bdevs_list": [ 00:32:03.381 { 00:32:03.381 "name": null, 00:32:03.381 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:03.381 "is_configured": false, 00:32:03.381 "data_offset": 0, 
00:32:03.381 "data_size": 65536 00:32:03.381 }, 00:32:03.381 { 00:32:03.381 "name": "BaseBdev2", 00:32:03.381 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:03.381 "is_configured": true, 00:32:03.381 "data_offset": 0, 00:32:03.381 "data_size": 65536 00:32:03.381 }, 00:32:03.381 { 00:32:03.381 "name": "BaseBdev3", 00:32:03.381 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:03.381 "is_configured": true, 00:32:03.381 "data_offset": 0, 00:32:03.381 "data_size": 65536 00:32:03.381 } 00:32:03.381 ] 00:32:03.381 }' 00:32:03.381 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:03.381 11:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:03.948 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:03.948 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:32:04.206 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:32:04.206 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:04.206 11:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:32:04.206 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3da126dc-058c-433f-9820-44697496a938 00:32:04.465 [2024-06-10 11:42:48.250183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:32:04.465 [2024-06-10 11:42:48.250215] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 
0x25e7ea0 00:32:04.465 [2024-06-10 11:42:48.250220] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:32:04.465 [2024-06-10 11:42:48.250350] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27a5140 00:32:04.465 [2024-06-10 11:42:48.250435] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25e7ea0 00:32:04.465 [2024-06-10 11:42:48.250442] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25e7ea0 00:32:04.465 [2024-06-10 11:42:48.250575] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:04.465 NewBaseBdev 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:04.465 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:32:04.723 [ 00:32:04.723 { 00:32:04.723 "name": "NewBaseBdev", 00:32:04.723 "aliases": [ 00:32:04.723 "3da126dc-058c-433f-9820-44697496a938" 00:32:04.723 ], 00:32:04.723 "product_name": "Malloc disk", 00:32:04.723 
"block_size": 512, 00:32:04.723 "num_blocks": 65536, 00:32:04.723 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:04.723 "assigned_rate_limits": { 00:32:04.723 "rw_ios_per_sec": 0, 00:32:04.723 "rw_mbytes_per_sec": 0, 00:32:04.723 "r_mbytes_per_sec": 0, 00:32:04.723 "w_mbytes_per_sec": 0 00:32:04.723 }, 00:32:04.723 "claimed": true, 00:32:04.723 "claim_type": "exclusive_write", 00:32:04.723 "zoned": false, 00:32:04.723 "supported_io_types": { 00:32:04.723 "read": true, 00:32:04.723 "write": true, 00:32:04.723 "unmap": true, 00:32:04.723 "write_zeroes": true, 00:32:04.723 "flush": true, 00:32:04.723 "reset": true, 00:32:04.723 "compare": false, 00:32:04.723 "compare_and_write": false, 00:32:04.723 "abort": true, 00:32:04.723 "nvme_admin": false, 00:32:04.723 "nvme_io": false 00:32:04.723 }, 00:32:04.723 "memory_domains": [ 00:32:04.723 { 00:32:04.723 "dma_device_id": "system", 00:32:04.723 "dma_device_type": 1 00:32:04.723 }, 00:32:04.723 { 00:32:04.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:04.723 "dma_device_type": 2 00:32:04.723 } 00:32:04.723 ], 00:32:04.723 "driver_specific": {} 00:32:04.723 } 00:32:04.723 ] 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:04.723 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:04.723 11:42:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:04.724 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:04.724 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:04.724 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:04.724 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:04.724 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:04.982 11:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:04.982 "name": "Existed_Raid", 00:32:04.982 "uuid": "ebb72180-759b-41d4-b7ed-8fe22f5a0155", 00:32:04.982 "strip_size_kb": 0, 00:32:04.982 "state": "online", 00:32:04.982 "raid_level": "raid1", 00:32:04.982 "superblock": false, 00:32:04.982 "num_base_bdevs": 3, 00:32:04.982 "num_base_bdevs_discovered": 3, 00:32:04.982 "num_base_bdevs_operational": 3, 00:32:04.982 "base_bdevs_list": [ 00:32:04.982 { 00:32:04.982 "name": "NewBaseBdev", 00:32:04.982 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:04.982 "is_configured": true, 00:32:04.982 "data_offset": 0, 00:32:04.982 "data_size": 65536 00:32:04.982 }, 00:32:04.982 { 00:32:04.982 "name": "BaseBdev2", 00:32:04.982 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:04.982 "is_configured": true, 00:32:04.982 "data_offset": 0, 00:32:04.982 "data_size": 65536 00:32:04.982 }, 00:32:04.982 { 00:32:04.982 "name": "BaseBdev3", 00:32:04.982 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:04.982 "is_configured": true, 00:32:04.982 "data_offset": 0, 00:32:04.982 "data_size": 65536 00:32:04.982 } 00:32:04.982 ] 00:32:04.982 }' 00:32:04.982 11:42:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:04.982 11:42:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:05.549 [2024-06-10 11:42:49.389301] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:05.549 "name": "Existed_Raid", 00:32:05.549 "aliases": [ 00:32:05.549 "ebb72180-759b-41d4-b7ed-8fe22f5a0155" 00:32:05.549 ], 00:32:05.549 "product_name": "Raid Volume", 00:32:05.549 "block_size": 512, 00:32:05.549 "num_blocks": 65536, 00:32:05.549 "uuid": "ebb72180-759b-41d4-b7ed-8fe22f5a0155", 00:32:05.549 "assigned_rate_limits": { 00:32:05.549 "rw_ios_per_sec": 0, 00:32:05.549 "rw_mbytes_per_sec": 0, 00:32:05.549 "r_mbytes_per_sec": 0, 00:32:05.549 "w_mbytes_per_sec": 0 00:32:05.549 }, 00:32:05.549 "claimed": false, 00:32:05.549 "zoned": false, 00:32:05.549 "supported_io_types": { 00:32:05.549 "read": 
true, 00:32:05.549 "write": true, 00:32:05.549 "unmap": false, 00:32:05.549 "write_zeroes": true, 00:32:05.549 "flush": false, 00:32:05.549 "reset": true, 00:32:05.549 "compare": false, 00:32:05.549 "compare_and_write": false, 00:32:05.549 "abort": false, 00:32:05.549 "nvme_admin": false, 00:32:05.549 "nvme_io": false 00:32:05.549 }, 00:32:05.549 "memory_domains": [ 00:32:05.549 { 00:32:05.549 "dma_device_id": "system", 00:32:05.549 "dma_device_type": 1 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:05.549 "dma_device_type": 2 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "dma_device_id": "system", 00:32:05.549 "dma_device_type": 1 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:05.549 "dma_device_type": 2 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "dma_device_id": "system", 00:32:05.549 "dma_device_type": 1 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:05.549 "dma_device_type": 2 00:32:05.549 } 00:32:05.549 ], 00:32:05.549 "driver_specific": { 00:32:05.549 "raid": { 00:32:05.549 "uuid": "ebb72180-759b-41d4-b7ed-8fe22f5a0155", 00:32:05.549 "strip_size_kb": 0, 00:32:05.549 "state": "online", 00:32:05.549 "raid_level": "raid1", 00:32:05.549 "superblock": false, 00:32:05.549 "num_base_bdevs": 3, 00:32:05.549 "num_base_bdevs_discovered": 3, 00:32:05.549 "num_base_bdevs_operational": 3, 00:32:05.549 "base_bdevs_list": [ 00:32:05.549 { 00:32:05.549 "name": "NewBaseBdev", 00:32:05.549 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:05.549 "is_configured": true, 00:32:05.549 "data_offset": 0, 00:32:05.549 "data_size": 65536 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "name": "BaseBdev2", 00:32:05.549 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:05.549 "is_configured": true, 00:32:05.549 "data_offset": 0, 00:32:05.549 "data_size": 65536 00:32:05.549 }, 00:32:05.549 { 00:32:05.549 "name": "BaseBdev3", 00:32:05.549 "uuid": 
"559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:05.549 "is_configured": true, 00:32:05.549 "data_offset": 0, 00:32:05.549 "data_size": 65536 00:32:05.549 } 00:32:05.549 ] 00:32:05.549 } 00:32:05.549 } 00:32:05.549 }' 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:32:05.549 BaseBdev2 00:32:05.549 BaseBdev3' 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:32:05.549 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:05.808 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:05.808 "name": "NewBaseBdev", 00:32:05.808 "aliases": [ 00:32:05.808 "3da126dc-058c-433f-9820-44697496a938" 00:32:05.808 ], 00:32:05.808 "product_name": "Malloc disk", 00:32:05.808 "block_size": 512, 00:32:05.808 "num_blocks": 65536, 00:32:05.808 "uuid": "3da126dc-058c-433f-9820-44697496a938", 00:32:05.808 "assigned_rate_limits": { 00:32:05.808 "rw_ios_per_sec": 0, 00:32:05.808 "rw_mbytes_per_sec": 0, 00:32:05.808 "r_mbytes_per_sec": 0, 00:32:05.808 "w_mbytes_per_sec": 0 00:32:05.808 }, 00:32:05.808 "claimed": true, 00:32:05.808 "claim_type": "exclusive_write", 00:32:05.808 "zoned": false, 00:32:05.808 "supported_io_types": { 00:32:05.808 "read": true, 00:32:05.808 "write": true, 00:32:05.808 "unmap": true, 00:32:05.808 "write_zeroes": true, 00:32:05.808 "flush": true, 00:32:05.808 "reset": true, 00:32:05.808 "compare": false, 00:32:05.808 "compare_and_write": false, 00:32:05.808 "abort": true, 
00:32:05.808 "nvme_admin": false, 00:32:05.808 "nvme_io": false 00:32:05.808 }, 00:32:05.808 "memory_domains": [ 00:32:05.808 { 00:32:05.808 "dma_device_id": "system", 00:32:05.808 "dma_device_type": 1 00:32:05.808 }, 00:32:05.808 { 00:32:05.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:05.808 "dma_device_type": 2 00:32:05.808 } 00:32:05.808 ], 00:32:05.808 "driver_specific": {} 00:32:05.808 }' 00:32:05.808 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:05.808 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:05.808 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:05.808 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:05.808 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:06.066 11:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:06.324 "name": "BaseBdev2", 00:32:06.324 "aliases": [ 00:32:06.324 "6566273b-ce24-4060-8cde-3ea06e48a9c3" 00:32:06.324 ], 00:32:06.324 "product_name": "Malloc disk", 00:32:06.324 "block_size": 512, 00:32:06.324 "num_blocks": 65536, 00:32:06.324 "uuid": "6566273b-ce24-4060-8cde-3ea06e48a9c3", 00:32:06.324 "assigned_rate_limits": { 00:32:06.324 "rw_ios_per_sec": 0, 00:32:06.324 "rw_mbytes_per_sec": 0, 00:32:06.324 "r_mbytes_per_sec": 0, 00:32:06.324 "w_mbytes_per_sec": 0 00:32:06.324 }, 00:32:06.324 "claimed": true, 00:32:06.324 "claim_type": "exclusive_write", 00:32:06.324 "zoned": false, 00:32:06.324 "supported_io_types": { 00:32:06.324 "read": true, 00:32:06.324 "write": true, 00:32:06.324 "unmap": true, 00:32:06.324 "write_zeroes": true, 00:32:06.324 "flush": true, 00:32:06.324 "reset": true, 00:32:06.324 "compare": false, 00:32:06.324 "compare_and_write": false, 00:32:06.324 "abort": true, 00:32:06.324 "nvme_admin": false, 00:32:06.324 "nvme_io": false 00:32:06.324 }, 00:32:06.324 "memory_domains": [ 00:32:06.324 { 00:32:06.324 "dma_device_id": "system", 00:32:06.324 "dma_device_type": 1 00:32:06.324 }, 00:32:06.324 { 00:32:06.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:06.324 "dma_device_type": 2 00:32:06.324 } 00:32:06.324 ], 00:32:06.324 "driver_specific": {} 00:32:06.324 }' 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:06.324 
11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:06.324 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:32:06.583 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:06.841 "name": "BaseBdev3", 00:32:06.841 "aliases": [ 00:32:06.841 "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc" 00:32:06.841 ], 00:32:06.841 "product_name": "Malloc disk", 00:32:06.841 "block_size": 512, 00:32:06.841 "num_blocks": 65536, 00:32:06.841 "uuid": "559fc6b8-53d8-4b68-ac4c-3fc6b027b6cc", 00:32:06.841 "assigned_rate_limits": { 00:32:06.841 "rw_ios_per_sec": 0, 00:32:06.841 "rw_mbytes_per_sec": 0, 00:32:06.841 "r_mbytes_per_sec": 0, 00:32:06.841 "w_mbytes_per_sec": 0 00:32:06.841 }, 00:32:06.841 "claimed": true, 00:32:06.841 "claim_type": "exclusive_write", 00:32:06.841 "zoned": false, 00:32:06.841 "supported_io_types": { 00:32:06.841 "read": true, 00:32:06.841 "write": true, 00:32:06.841 "unmap": true, 00:32:06.841 "write_zeroes": true, 
00:32:06.841 "flush": true, 00:32:06.841 "reset": true, 00:32:06.841 "compare": false, 00:32:06.841 "compare_and_write": false, 00:32:06.841 "abort": true, 00:32:06.841 "nvme_admin": false, 00:32:06.841 "nvme_io": false 00:32:06.841 }, 00:32:06.841 "memory_domains": [ 00:32:06.841 { 00:32:06.841 "dma_device_id": "system", 00:32:06.841 "dma_device_type": 1 00:32:06.841 }, 00:32:06.841 { 00:32:06.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:06.841 "dma_device_type": 2 00:32:06.841 } 00:32:06.841 ], 00:32:06.841 "driver_specific": {} 00:32:06.841 }' 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:06.841 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:07.099 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:07.099 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:07.099 11:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:32:07.099 [2024-06-10 11:42:51.009365] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:07.099 [2024-06-10 11:42:51.009385] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:07.099 [2024-06-10 11:42:51.009423] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:07.099 [2024-06-10 11:42:51.009599] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:07.099 [2024-06-10 11:42:51.009607] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e7ea0 name Existed_Raid, state offline 00:32:07.099 11:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 168460 00:32:07.099 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 168460 ']' 00:32:07.099 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 168460 00:32:07.099 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:32:07.099 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:07.099 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 168460 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 168460' 00:32:07.359 killing process with pid 168460 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 168460 00:32:07.359 [2024-06-10 11:42:51.075450] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 
00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 168460 00:32:07.359 [2024-06-10 11:42:51.101528] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:32:07.359 00:32:07.359 real 0m21.756s 00:32:07.359 user 0m39.585s 00:32:07.359 sys 0m4.223s 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:07.359 11:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:07.359 ************************************ 00:32:07.359 END TEST raid_state_function_test 00:32:07.359 ************************************ 00:32:07.618 11:42:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:32:07.618 11:42:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:07.618 11:42:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:07.618 11:42:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:07.618 ************************************ 00:32:07.618 START TEST raid_state_function_test_sb 00:32:07.618 ************************************ 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 true 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:32:07.618 11:42:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:32:07.618 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 
00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=171894 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 171894' 00:32:07.619 Process raid pid: 171894 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 171894 /var/tmp/spdk-raid.sock 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 171894 ']' 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:07.619 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:07.619 11:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:07.619 [2024-06-10 11:42:51.422816] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:32:07.619 [2024-06-10 11:42:51.422863] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:07.619 [2024-06-10 11:42:51.505247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:07.877 [2024-06-10 11:42:51.589961] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:07.877 [2024-06-10 11:42:51.644104] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:07.877 [2024-06-10 11:42:51.644125] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:08.444 11:42:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:08.444 11:42:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:32:08.444 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:32:08.444 [2024-06-10 11:42:52.352655] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:08.444 [2024-06-10 11:42:52.352690] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:08.444 [2024-06-10 11:42:52.352698] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:08.444 [2024-06-10 11:42:52.352705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:08.444 [2024-06-10 11:42:52.352712] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:08.444 [2024-06-10 11:42:52.352719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:08.444 11:42:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:08.444 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:08.444 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:08.444 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:08.445 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:08.703 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:08.703 "name": "Existed_Raid", 00:32:08.703 "uuid": "c17876e0-31c5-4f53-9bc7-02fe61b7d3d9", 00:32:08.703 "strip_size_kb": 0, 00:32:08.703 "state": "configuring", 00:32:08.703 "raid_level": "raid1", 00:32:08.703 "superblock": true, 00:32:08.703 "num_base_bdevs": 3, 00:32:08.703 "num_base_bdevs_discovered": 0, 00:32:08.703 "num_base_bdevs_operational": 3, 00:32:08.703 
"base_bdevs_list": [ 00:32:08.703 { 00:32:08.703 "name": "BaseBdev1", 00:32:08.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:08.703 "is_configured": false, 00:32:08.703 "data_offset": 0, 00:32:08.703 "data_size": 0 00:32:08.703 }, 00:32:08.703 { 00:32:08.703 "name": "BaseBdev2", 00:32:08.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:08.703 "is_configured": false, 00:32:08.703 "data_offset": 0, 00:32:08.703 "data_size": 0 00:32:08.703 }, 00:32:08.703 { 00:32:08.703 "name": "BaseBdev3", 00:32:08.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:08.703 "is_configured": false, 00:32:08.703 "data_offset": 0, 00:32:08.703 "data_size": 0 00:32:08.703 } 00:32:08.703 ] 00:32:08.703 }' 00:32:08.703 11:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:08.703 11:42:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:09.269 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:09.269 [2024-06-10 11:42:53.210773] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:09.269 [2024-06-10 11:42:53.210792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1785530 name Existed_Raid, state configuring 00:32:09.527 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:32:09.527 [2024-06-10 11:42:53.387251] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:09.527 [2024-06-10 11:42:53.387270] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:09.527 [2024-06-10 11:42:53.387276] bdev.c:8114:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:09.527 [2024-06-10 11:42:53.387284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:09.527 [2024-06-10 11:42:53.387305] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:09.527 [2024-06-10 11:42:53.387312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:09.527 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:32:09.786 [2024-06-10 11:42:53.568176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:09.786 BaseBdev1 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:09.786 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:32:10.044 [ 00:32:10.044 { 00:32:10.044 "name": 
"BaseBdev1", 00:32:10.044 "aliases": [ 00:32:10.044 "0194d5f9-69c2-4c64-ae87-ab2c524265b9" 00:32:10.044 ], 00:32:10.044 "product_name": "Malloc disk", 00:32:10.044 "block_size": 512, 00:32:10.044 "num_blocks": 65536, 00:32:10.044 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:10.044 "assigned_rate_limits": { 00:32:10.044 "rw_ios_per_sec": 0, 00:32:10.044 "rw_mbytes_per_sec": 0, 00:32:10.044 "r_mbytes_per_sec": 0, 00:32:10.044 "w_mbytes_per_sec": 0 00:32:10.044 }, 00:32:10.044 "claimed": true, 00:32:10.044 "claim_type": "exclusive_write", 00:32:10.044 "zoned": false, 00:32:10.044 "supported_io_types": { 00:32:10.044 "read": true, 00:32:10.044 "write": true, 00:32:10.044 "unmap": true, 00:32:10.044 "write_zeroes": true, 00:32:10.044 "flush": true, 00:32:10.044 "reset": true, 00:32:10.044 "compare": false, 00:32:10.044 "compare_and_write": false, 00:32:10.044 "abort": true, 00:32:10.044 "nvme_admin": false, 00:32:10.044 "nvme_io": false 00:32:10.044 }, 00:32:10.044 "memory_domains": [ 00:32:10.044 { 00:32:10.044 "dma_device_id": "system", 00:32:10.044 "dma_device_type": 1 00:32:10.044 }, 00:32:10.044 { 00:32:10.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:10.044 "dma_device_type": 2 00:32:10.044 } 00:32:10.044 ], 00:32:10.044 "driver_specific": {} 00:32:10.044 } 00:32:10.044 ] 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:10.044 11:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:10.303 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:10.303 "name": "Existed_Raid", 00:32:10.303 "uuid": "610b75aa-651f-48ea-bd67-9273c9c13ff4", 00:32:10.303 "strip_size_kb": 0, 00:32:10.303 "state": "configuring", 00:32:10.303 "raid_level": "raid1", 00:32:10.303 "superblock": true, 00:32:10.303 "num_base_bdevs": 3, 00:32:10.303 "num_base_bdevs_discovered": 1, 00:32:10.303 "num_base_bdevs_operational": 3, 00:32:10.303 "base_bdevs_list": [ 00:32:10.303 { 00:32:10.303 "name": "BaseBdev1", 00:32:10.303 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:10.303 "is_configured": true, 00:32:10.303 "data_offset": 2048, 00:32:10.303 "data_size": 63488 00:32:10.303 }, 00:32:10.303 { 00:32:10.303 "name": "BaseBdev2", 00:32:10.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:10.303 "is_configured": false, 00:32:10.303 "data_offset": 0, 00:32:10.303 "data_size": 0 00:32:10.303 }, 00:32:10.303 { 00:32:10.303 "name": "BaseBdev3", 00:32:10.303 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:32:10.303 "is_configured": false, 00:32:10.303 "data_offset": 0, 00:32:10.303 "data_size": 0 00:32:10.303 } 00:32:10.303 ] 00:32:10.303 }' 00:32:10.303 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:10.303 11:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:10.869 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:10.869 [2024-06-10 11:42:54.723151] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:10.869 [2024-06-10 11:42:54.723183] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1784e00 name Existed_Raid, state configuring 00:32:10.869 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:32:11.128 [2024-06-10 11:42:54.883594] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:11.128 [2024-06-10 11:42:54.884692] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:11.128 [2024-06-10 11:42:54.884719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:11.128 [2024-06-10 11:42:54.884726] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:11.128 [2024-06-10 11:42:54.884734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 
00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:11.128 11:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:11.387 11:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:11.387 "name": "Existed_Raid", 00:32:11.387 "uuid": "f2d6f142-30f2-4f99-84b5-a07881f59d28", 00:32:11.387 "strip_size_kb": 0, 00:32:11.387 "state": "configuring", 00:32:11.387 "raid_level": "raid1", 00:32:11.387 "superblock": true, 00:32:11.387 "num_base_bdevs": 3, 00:32:11.387 "num_base_bdevs_discovered": 1, 00:32:11.387 "num_base_bdevs_operational": 3, 
00:32:11.387 "base_bdevs_list": [ 00:32:11.387 { 00:32:11.387 "name": "BaseBdev1", 00:32:11.387 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:11.387 "is_configured": true, 00:32:11.387 "data_offset": 2048, 00:32:11.387 "data_size": 63488 00:32:11.387 }, 00:32:11.387 { 00:32:11.387 "name": "BaseBdev2", 00:32:11.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:11.387 "is_configured": false, 00:32:11.387 "data_offset": 0, 00:32:11.387 "data_size": 0 00:32:11.387 }, 00:32:11.387 { 00:32:11.387 "name": "BaseBdev3", 00:32:11.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:11.387 "is_configured": false, 00:32:11.387 "data_offset": 0, 00:32:11.387 "data_size": 0 00:32:11.387 } 00:32:11.387 ] 00:32:11.387 }' 00:32:11.387 11:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:11.387 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:11.645 11:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:32:11.903 [2024-06-10 11:42:55.740614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:11.903 BaseBdev2 00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 
00:32:11.903 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:12.161 11:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:32:12.161 [ 00:32:12.161 { 00:32:12.161 "name": "BaseBdev2", 00:32:12.161 "aliases": [ 00:32:12.161 "244143cc-e357-4836-b5f4-6a7c0cb8144c" 00:32:12.161 ], 00:32:12.161 "product_name": "Malloc disk", 00:32:12.161 "block_size": 512, 00:32:12.161 "num_blocks": 65536, 00:32:12.161 "uuid": "244143cc-e357-4836-b5f4-6a7c0cb8144c", 00:32:12.161 "assigned_rate_limits": { 00:32:12.161 "rw_ios_per_sec": 0, 00:32:12.161 "rw_mbytes_per_sec": 0, 00:32:12.161 "r_mbytes_per_sec": 0, 00:32:12.161 "w_mbytes_per_sec": 0 00:32:12.161 }, 00:32:12.161 "claimed": true, 00:32:12.161 "claim_type": "exclusive_write", 00:32:12.161 "zoned": false, 00:32:12.161 "supported_io_types": { 00:32:12.161 "read": true, 00:32:12.161 "write": true, 00:32:12.161 "unmap": true, 00:32:12.161 "write_zeroes": true, 00:32:12.161 "flush": true, 00:32:12.161 "reset": true, 00:32:12.161 "compare": false, 00:32:12.161 "compare_and_write": false, 00:32:12.161 "abort": true, 00:32:12.161 "nvme_admin": false, 00:32:12.161 "nvme_io": false 00:32:12.161 }, 00:32:12.161 "memory_domains": [ 00:32:12.161 { 00:32:12.161 "dma_device_id": "system", 00:32:12.161 "dma_device_type": 1 00:32:12.161 }, 00:32:12.161 { 00:32:12.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:12.161 "dma_device_type": 2 00:32:12.161 } 00:32:12.161 ], 00:32:12.161 "driver_specific": {} 00:32:12.161 } 00:32:12.161 ] 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ 
)) 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:12.420 "name": "Existed_Raid", 00:32:12.420 "uuid": "f2d6f142-30f2-4f99-84b5-a07881f59d28", 00:32:12.420 "strip_size_kb": 0, 00:32:12.420 "state": "configuring", 00:32:12.420 "raid_level": "raid1", 00:32:12.420 "superblock": true, 
00:32:12.420 "num_base_bdevs": 3, 00:32:12.420 "num_base_bdevs_discovered": 2, 00:32:12.420 "num_base_bdevs_operational": 3, 00:32:12.420 "base_bdevs_list": [ 00:32:12.420 { 00:32:12.420 "name": "BaseBdev1", 00:32:12.420 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:12.420 "is_configured": true, 00:32:12.420 "data_offset": 2048, 00:32:12.420 "data_size": 63488 00:32:12.420 }, 00:32:12.420 { 00:32:12.420 "name": "BaseBdev2", 00:32:12.420 "uuid": "244143cc-e357-4836-b5f4-6a7c0cb8144c", 00:32:12.420 "is_configured": true, 00:32:12.420 "data_offset": 2048, 00:32:12.420 "data_size": 63488 00:32:12.420 }, 00:32:12.420 { 00:32:12.420 "name": "BaseBdev3", 00:32:12.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:12.420 "is_configured": false, 00:32:12.420 "data_offset": 0, 00:32:12.420 "data_size": 0 00:32:12.420 } 00:32:12.420 ] 00:32:12.420 }' 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:12.420 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:12.987 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:32:13.245 [2024-06-10 11:42:56.966676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:13.245 [2024-06-10 11:42:56.966802] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1785cf0 00:32:13.245 [2024-06-10 11:42:56.966812] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:13.245 [2024-06-10 11:42:56.966956] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x179cbe0 00:32:13.245 [2024-06-10 11:42:56.967055] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1785cf0 00:32:13.245 [2024-06-10 11:42:56.967063] bdev_raid.c:1725:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1785cf0 00:32:13.245 [2024-06-10 11:42:56.967135] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:13.245 BaseBdev3 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:13.245 11:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:13.245 11:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:32:13.504 [ 00:32:13.504 { 00:32:13.504 "name": "BaseBdev3", 00:32:13.504 "aliases": [ 00:32:13.504 "398cbf1d-013d-4c1c-8b9c-f8a753af45f4" 00:32:13.504 ], 00:32:13.504 "product_name": "Malloc disk", 00:32:13.504 "block_size": 512, 00:32:13.504 "num_blocks": 65536, 00:32:13.504 "uuid": "398cbf1d-013d-4c1c-8b9c-f8a753af45f4", 00:32:13.504 "assigned_rate_limits": { 00:32:13.504 "rw_ios_per_sec": 0, 00:32:13.504 "rw_mbytes_per_sec": 0, 00:32:13.504 "r_mbytes_per_sec": 0, 00:32:13.504 "w_mbytes_per_sec": 0 00:32:13.504 }, 00:32:13.504 "claimed": true, 00:32:13.504 "claim_type": "exclusive_write", 00:32:13.504 "zoned": false, 00:32:13.504 "supported_io_types": { 
00:32:13.504 "read": true, 00:32:13.504 "write": true, 00:32:13.504 "unmap": true, 00:32:13.504 "write_zeroes": true, 00:32:13.504 "flush": true, 00:32:13.504 "reset": true, 00:32:13.504 "compare": false, 00:32:13.504 "compare_and_write": false, 00:32:13.504 "abort": true, 00:32:13.504 "nvme_admin": false, 00:32:13.504 "nvme_io": false 00:32:13.504 }, 00:32:13.504 "memory_domains": [ 00:32:13.504 { 00:32:13.504 "dma_device_id": "system", 00:32:13.504 "dma_device_type": 1 00:32:13.504 }, 00:32:13.504 { 00:32:13.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:13.504 "dma_device_type": 2 00:32:13.504 } 00:32:13.504 ], 00:32:13.504 "driver_specific": {} 00:32:13.504 } 00:32:13.504 ] 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:13.504 11:42:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:13.504 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:13.762 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:13.762 "name": "Existed_Raid", 00:32:13.762 "uuid": "f2d6f142-30f2-4f99-84b5-a07881f59d28", 00:32:13.762 "strip_size_kb": 0, 00:32:13.762 "state": "online", 00:32:13.762 "raid_level": "raid1", 00:32:13.762 "superblock": true, 00:32:13.762 "num_base_bdevs": 3, 00:32:13.762 "num_base_bdevs_discovered": 3, 00:32:13.762 "num_base_bdevs_operational": 3, 00:32:13.762 "base_bdevs_list": [ 00:32:13.762 { 00:32:13.762 "name": "BaseBdev1", 00:32:13.762 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:13.762 "is_configured": true, 00:32:13.762 "data_offset": 2048, 00:32:13.762 "data_size": 63488 00:32:13.762 }, 00:32:13.762 { 00:32:13.762 "name": "BaseBdev2", 00:32:13.762 "uuid": "244143cc-e357-4836-b5f4-6a7c0cb8144c", 00:32:13.762 "is_configured": true, 00:32:13.762 "data_offset": 2048, 00:32:13.762 "data_size": 63488 00:32:13.762 }, 00:32:13.762 { 00:32:13.762 "name": "BaseBdev3", 00:32:13.762 "uuid": "398cbf1d-013d-4c1c-8b9c-f8a753af45f4", 00:32:13.762 "is_configured": true, 00:32:13.762 "data_offset": 2048, 00:32:13.762 "data_size": 63488 00:32:13.762 } 00:32:13.762 ] 00:32:13.762 }' 00:32:13.762 11:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:13.762 11:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:14.328 11:42:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:14.328 [2024-06-10 11:42:58.174025] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:14.328 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:14.328 "name": "Existed_Raid", 00:32:14.328 "aliases": [ 00:32:14.328 "f2d6f142-30f2-4f99-84b5-a07881f59d28" 00:32:14.328 ], 00:32:14.328 "product_name": "Raid Volume", 00:32:14.328 "block_size": 512, 00:32:14.328 "num_blocks": 63488, 00:32:14.328 "uuid": "f2d6f142-30f2-4f99-84b5-a07881f59d28", 00:32:14.328 "assigned_rate_limits": { 00:32:14.328 "rw_ios_per_sec": 0, 00:32:14.328 "rw_mbytes_per_sec": 0, 00:32:14.328 "r_mbytes_per_sec": 0, 00:32:14.328 "w_mbytes_per_sec": 0 00:32:14.328 }, 00:32:14.329 "claimed": false, 00:32:14.329 "zoned": false, 00:32:14.329 "supported_io_types": { 00:32:14.329 "read": true, 00:32:14.329 "write": true, 00:32:14.329 "unmap": false, 00:32:14.329 "write_zeroes": true, 00:32:14.329 "flush": false, 00:32:14.329 "reset": true, 00:32:14.329 
"compare": false, 00:32:14.329 "compare_and_write": false, 00:32:14.329 "abort": false, 00:32:14.329 "nvme_admin": false, 00:32:14.329 "nvme_io": false 00:32:14.329 }, 00:32:14.329 "memory_domains": [ 00:32:14.329 { 00:32:14.329 "dma_device_id": "system", 00:32:14.329 "dma_device_type": 1 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:14.329 "dma_device_type": 2 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "dma_device_id": "system", 00:32:14.329 "dma_device_type": 1 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:14.329 "dma_device_type": 2 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "dma_device_id": "system", 00:32:14.329 "dma_device_type": 1 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:14.329 "dma_device_type": 2 00:32:14.329 } 00:32:14.329 ], 00:32:14.329 "driver_specific": { 00:32:14.329 "raid": { 00:32:14.329 "uuid": "f2d6f142-30f2-4f99-84b5-a07881f59d28", 00:32:14.329 "strip_size_kb": 0, 00:32:14.329 "state": "online", 00:32:14.329 "raid_level": "raid1", 00:32:14.329 "superblock": true, 00:32:14.329 "num_base_bdevs": 3, 00:32:14.329 "num_base_bdevs_discovered": 3, 00:32:14.329 "num_base_bdevs_operational": 3, 00:32:14.329 "base_bdevs_list": [ 00:32:14.329 { 00:32:14.329 "name": "BaseBdev1", 00:32:14.329 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:14.329 "is_configured": true, 00:32:14.329 "data_offset": 2048, 00:32:14.329 "data_size": 63488 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "name": "BaseBdev2", 00:32:14.329 "uuid": "244143cc-e357-4836-b5f4-6a7c0cb8144c", 00:32:14.329 "is_configured": true, 00:32:14.329 "data_offset": 2048, 00:32:14.329 "data_size": 63488 00:32:14.329 }, 00:32:14.329 { 00:32:14.329 "name": "BaseBdev3", 00:32:14.329 "uuid": "398cbf1d-013d-4c1c-8b9c-f8a753af45f4", 00:32:14.329 "is_configured": true, 00:32:14.329 "data_offset": 2048, 00:32:14.329 "data_size": 63488 00:32:14.329 } 00:32:14.329 ] 
00:32:14.329 } 00:32:14.329 } 00:32:14.329 }' 00:32:14.329 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:14.329 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:32:14.329 BaseBdev2 00:32:14.329 BaseBdev3' 00:32:14.329 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:14.329 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:32:14.329 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:14.587 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:14.587 "name": "BaseBdev1", 00:32:14.587 "aliases": [ 00:32:14.587 "0194d5f9-69c2-4c64-ae87-ab2c524265b9" 00:32:14.587 ], 00:32:14.587 "product_name": "Malloc disk", 00:32:14.587 "block_size": 512, 00:32:14.587 "num_blocks": 65536, 00:32:14.587 "uuid": "0194d5f9-69c2-4c64-ae87-ab2c524265b9", 00:32:14.587 "assigned_rate_limits": { 00:32:14.587 "rw_ios_per_sec": 0, 00:32:14.587 "rw_mbytes_per_sec": 0, 00:32:14.587 "r_mbytes_per_sec": 0, 00:32:14.587 "w_mbytes_per_sec": 0 00:32:14.587 }, 00:32:14.587 "claimed": true, 00:32:14.587 "claim_type": "exclusive_write", 00:32:14.587 "zoned": false, 00:32:14.587 "supported_io_types": { 00:32:14.587 "read": true, 00:32:14.587 "write": true, 00:32:14.587 "unmap": true, 00:32:14.587 "write_zeroes": true, 00:32:14.587 "flush": true, 00:32:14.587 "reset": true, 00:32:14.587 "compare": false, 00:32:14.587 "compare_and_write": false, 00:32:14.587 "abort": true, 00:32:14.587 "nvme_admin": false, 00:32:14.587 "nvme_io": false 00:32:14.587 }, 00:32:14.587 "memory_domains": [ 00:32:14.587 { 00:32:14.587 "dma_device_id": "system", 
00:32:14.587 "dma_device_type": 1 00:32:14.587 }, 00:32:14.587 { 00:32:14.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:14.587 "dma_device_type": 2 00:32:14.587 } 00:32:14.587 ], 00:32:14.587 "driver_specific": {} 00:32:14.587 }' 00:32:14.587 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:14.587 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:14.587 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:14.587 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:14.587 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:32:14.845 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:15.104 11:42:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:15.104 "name": "BaseBdev2", 00:32:15.104 "aliases": [ 00:32:15.104 "244143cc-e357-4836-b5f4-6a7c0cb8144c" 00:32:15.104 ], 00:32:15.104 "product_name": "Malloc disk", 00:32:15.104 "block_size": 512, 00:32:15.104 "num_blocks": 65536, 00:32:15.104 "uuid": "244143cc-e357-4836-b5f4-6a7c0cb8144c", 00:32:15.104 "assigned_rate_limits": { 00:32:15.104 "rw_ios_per_sec": 0, 00:32:15.104 "rw_mbytes_per_sec": 0, 00:32:15.104 "r_mbytes_per_sec": 0, 00:32:15.104 "w_mbytes_per_sec": 0 00:32:15.104 }, 00:32:15.104 "claimed": true, 00:32:15.104 "claim_type": "exclusive_write", 00:32:15.104 "zoned": false, 00:32:15.104 "supported_io_types": { 00:32:15.104 "read": true, 00:32:15.104 "write": true, 00:32:15.104 "unmap": true, 00:32:15.104 "write_zeroes": true, 00:32:15.104 "flush": true, 00:32:15.104 "reset": true, 00:32:15.104 "compare": false, 00:32:15.104 "compare_and_write": false, 00:32:15.104 "abort": true, 00:32:15.104 "nvme_admin": false, 00:32:15.104 "nvme_io": false 00:32:15.104 }, 00:32:15.104 "memory_domains": [ 00:32:15.104 { 00:32:15.104 "dma_device_id": "system", 00:32:15.104 "dma_device_type": 1 00:32:15.104 }, 00:32:15.104 { 00:32:15.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:15.104 "dma_device_type": 2 00:32:15.104 } 00:32:15.104 ], 00:32:15.104 "driver_specific": {} 00:32:15.104 }' 00:32:15.104 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:15.104 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:15.104 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:15.104 11:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:15.104 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:32:15.362 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:15.620 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:15.620 "name": "BaseBdev3", 00:32:15.620 "aliases": [ 00:32:15.620 "398cbf1d-013d-4c1c-8b9c-f8a753af45f4" 00:32:15.620 ], 00:32:15.620 "product_name": "Malloc disk", 00:32:15.620 "block_size": 512, 00:32:15.620 "num_blocks": 65536, 00:32:15.620 "uuid": "398cbf1d-013d-4c1c-8b9c-f8a753af45f4", 00:32:15.620 "assigned_rate_limits": { 00:32:15.620 "rw_ios_per_sec": 0, 00:32:15.620 "rw_mbytes_per_sec": 0, 00:32:15.620 "r_mbytes_per_sec": 0, 00:32:15.620 "w_mbytes_per_sec": 0 00:32:15.620 }, 00:32:15.620 "claimed": true, 00:32:15.620 "claim_type": "exclusive_write", 00:32:15.620 "zoned": false, 00:32:15.620 "supported_io_types": { 00:32:15.620 "read": true, 00:32:15.620 "write": true, 00:32:15.620 "unmap": true, 00:32:15.620 "write_zeroes": true, 00:32:15.620 "flush": true, 00:32:15.620 "reset": true, 00:32:15.620 
"compare": false, 00:32:15.620 "compare_and_write": false, 00:32:15.620 "abort": true, 00:32:15.620 "nvme_admin": false, 00:32:15.620 "nvme_io": false 00:32:15.620 }, 00:32:15.620 "memory_domains": [ 00:32:15.620 { 00:32:15.620 "dma_device_id": "system", 00:32:15.620 "dma_device_type": 1 00:32:15.620 }, 00:32:15.620 { 00:32:15.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:15.620 "dma_device_type": 2 00:32:15.620 } 00:32:15.620 ], 00:32:15.620 "driver_specific": {} 00:32:15.620 }' 00:32:15.620 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:15.620 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:15.620 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:15.620 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:15.620 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:15.878 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:32:16.136 [2024-06-10 
11:42:59.882302] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:16.136 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:32:16.137 11:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:16.395 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:16.395 "name": "Existed_Raid", 00:32:16.395 "uuid": "f2d6f142-30f2-4f99-84b5-a07881f59d28", 00:32:16.395 "strip_size_kb": 0, 00:32:16.395 "state": "online", 00:32:16.395 "raid_level": "raid1", 00:32:16.395 "superblock": true, 00:32:16.395 "num_base_bdevs": 3, 00:32:16.395 "num_base_bdevs_discovered": 2, 00:32:16.395 "num_base_bdevs_operational": 2, 00:32:16.395 "base_bdevs_list": [ 00:32:16.395 { 00:32:16.395 "name": null, 00:32:16.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:16.395 "is_configured": false, 00:32:16.395 "data_offset": 2048, 00:32:16.395 "data_size": 63488 00:32:16.395 }, 00:32:16.395 { 00:32:16.395 "name": "BaseBdev2", 00:32:16.395 "uuid": "244143cc-e357-4836-b5f4-6a7c0cb8144c", 00:32:16.395 "is_configured": true, 00:32:16.395 "data_offset": 2048, 00:32:16.395 "data_size": 63488 00:32:16.395 }, 00:32:16.395 { 00:32:16.395 "name": "BaseBdev3", 00:32:16.395 "uuid": "398cbf1d-013d-4c1c-8b9c-f8a753af45f4", 00:32:16.395 "is_configured": true, 00:32:16.395 "data_offset": 2048, 00:32:16.395 "data_size": 63488 00:32:16.395 } 00:32:16.395 ] 00:32:16.395 }' 00:32:16.395 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:16.395 11:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:16.653 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:32:16.653 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:16.653 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:16.653 11:43:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:32:16.911 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:32:16.911 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:32:16.911 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:32:17.169 [2024-06-10 11:43:00.902644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:32:17.169 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:32:17.169 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:17.169 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:17.169 11:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:32:17.169 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:32:17.169 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:32:17.169 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:32:17.427 [2024-06-10 11:43:01.271650] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:32:17.427 [2024-06-10 11:43:01.271714] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:17.427 [2024-06-10 11:43:01.283785] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:17.427 
[2024-06-10 11:43:01.283809] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:17.427 [2024-06-10 11:43:01.283818] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1785cf0 name Existed_Raid, state offline 00:32:17.427 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:32:17.427 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:17.427 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:17.427 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:32:17.685 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:32:17.685 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:32:17.685 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:32:17.685 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:32:17.685 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:17.685 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:32:17.944 BaseBdev2 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:17.944 11:43:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:17.944 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:32:18.202 [ 00:32:18.202 { 00:32:18.202 "name": "BaseBdev2", 00:32:18.202 "aliases": [ 00:32:18.202 "3097979e-858e-4ebb-ab2a-5a9ee0adab8c" 00:32:18.202 ], 00:32:18.202 "product_name": "Malloc disk", 00:32:18.202 "block_size": 512, 00:32:18.202 "num_blocks": 65536, 00:32:18.202 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:18.202 "assigned_rate_limits": { 00:32:18.202 "rw_ios_per_sec": 0, 00:32:18.202 "rw_mbytes_per_sec": 0, 00:32:18.202 "r_mbytes_per_sec": 0, 00:32:18.202 "w_mbytes_per_sec": 0 00:32:18.202 }, 00:32:18.202 "claimed": false, 00:32:18.202 "zoned": false, 00:32:18.202 "supported_io_types": { 00:32:18.202 "read": true, 00:32:18.202 "write": true, 00:32:18.202 "unmap": true, 00:32:18.202 "write_zeroes": true, 00:32:18.202 "flush": true, 00:32:18.202 "reset": true, 00:32:18.202 "compare": false, 00:32:18.202 "compare_and_write": false, 00:32:18.202 "abort": true, 00:32:18.202 "nvme_admin": false, 00:32:18.202 "nvme_io": false 00:32:18.202 }, 00:32:18.202 "memory_domains": [ 00:32:18.202 { 00:32:18.202 "dma_device_id": "system", 00:32:18.202 "dma_device_type": 1 00:32:18.202 }, 00:32:18.202 { 00:32:18.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:18.202 "dma_device_type": 2 00:32:18.202 } 00:32:18.202 ], 
00:32:18.202 "driver_specific": {} 00:32:18.202 } 00:32:18.202 ] 00:32:18.202 11:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:18.202 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:32:18.202 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:18.202 11:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:32:18.460 BaseBdev3 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:18.460 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:32:18.718 [ 00:32:18.718 { 00:32:18.718 "name": "BaseBdev3", 00:32:18.718 "aliases": [ 00:32:18.718 "52df319f-69a0-45cd-8b7e-e19a081976f5" 00:32:18.718 ], 00:32:18.718 "product_name": "Malloc disk", 00:32:18.718 "block_size": 512, 00:32:18.718 
"num_blocks": 65536, 00:32:18.718 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:18.718 "assigned_rate_limits": { 00:32:18.718 "rw_ios_per_sec": 0, 00:32:18.718 "rw_mbytes_per_sec": 0, 00:32:18.718 "r_mbytes_per_sec": 0, 00:32:18.719 "w_mbytes_per_sec": 0 00:32:18.719 }, 00:32:18.719 "claimed": false, 00:32:18.719 "zoned": false, 00:32:18.719 "supported_io_types": { 00:32:18.719 "read": true, 00:32:18.719 "write": true, 00:32:18.719 "unmap": true, 00:32:18.719 "write_zeroes": true, 00:32:18.719 "flush": true, 00:32:18.719 "reset": true, 00:32:18.719 "compare": false, 00:32:18.719 "compare_and_write": false, 00:32:18.719 "abort": true, 00:32:18.719 "nvme_admin": false, 00:32:18.719 "nvme_io": false 00:32:18.719 }, 00:32:18.719 "memory_domains": [ 00:32:18.719 { 00:32:18.719 "dma_device_id": "system", 00:32:18.719 "dma_device_type": 1 00:32:18.719 }, 00:32:18.719 { 00:32:18.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:18.719 "dma_device_type": 2 00:32:18.719 } 00:32:18.719 ], 00:32:18.719 "driver_specific": {} 00:32:18.719 } 00:32:18.719 ] 00:32:18.719 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:18.719 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:32:18.719 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:18.719 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:32:18.719 [2024-06-10 11:43:02.663077] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:18.719 [2024-06-10 11:43:02.663112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:18.719 [2024-06-10 11:43:02.663125] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:18.719 [2024-06-10 11:43:02.664177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:19.006 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:19.006 "name": "Existed_Raid", 00:32:19.006 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:19.006 "strip_size_kb": 0, 00:32:19.006 "state": 
"configuring", 00:32:19.006 "raid_level": "raid1", 00:32:19.006 "superblock": true, 00:32:19.006 "num_base_bdevs": 3, 00:32:19.006 "num_base_bdevs_discovered": 2, 00:32:19.006 "num_base_bdevs_operational": 3, 00:32:19.006 "base_bdevs_list": [ 00:32:19.006 { 00:32:19.006 "name": "BaseBdev1", 00:32:19.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:19.007 "is_configured": false, 00:32:19.007 "data_offset": 0, 00:32:19.007 "data_size": 0 00:32:19.007 }, 00:32:19.007 { 00:32:19.007 "name": "BaseBdev2", 00:32:19.007 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:19.007 "is_configured": true, 00:32:19.007 "data_offset": 2048, 00:32:19.007 "data_size": 63488 00:32:19.007 }, 00:32:19.007 { 00:32:19.007 "name": "BaseBdev3", 00:32:19.007 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:19.007 "is_configured": true, 00:32:19.007 "data_offset": 2048, 00:32:19.007 "data_size": 63488 00:32:19.007 } 00:32:19.007 ] 00:32:19.007 }' 00:32:19.007 11:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:19.007 11:43:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:19.608 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:32:19.608 [2024-06-10 11:43:03.541325] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:32:19.865 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:19.865 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:19.866 "name": "Existed_Raid", 00:32:19.866 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:19.866 "strip_size_kb": 0, 00:32:19.866 "state": "configuring", 00:32:19.866 "raid_level": "raid1", 00:32:19.866 "superblock": true, 00:32:19.866 "num_base_bdevs": 3, 00:32:19.866 "num_base_bdevs_discovered": 1, 00:32:19.866 "num_base_bdevs_operational": 3, 00:32:19.866 "base_bdevs_list": [ 00:32:19.866 { 00:32:19.866 "name": "BaseBdev1", 00:32:19.866 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:19.866 "is_configured": false, 00:32:19.866 "data_offset": 0, 00:32:19.866 "data_size": 0 00:32:19.866 }, 00:32:19.866 { 00:32:19.866 "name": null, 00:32:19.866 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:19.866 "is_configured": false, 00:32:19.866 "data_offset": 2048, 00:32:19.866 "data_size": 63488 00:32:19.866 }, 00:32:19.866 { 00:32:19.866 
"name": "BaseBdev3", 00:32:19.866 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:19.866 "is_configured": true, 00:32:19.866 "data_offset": 2048, 00:32:19.866 "data_size": 63488 00:32:19.866 } 00:32:19.866 ] 00:32:19.866 }' 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:19.866 11:43:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:20.431 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:20.431 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:32:20.689 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:32:20.689 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:32:20.689 [2024-06-10 11:43:04.568061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:20.689 BaseBdev1 00:32:20.689 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:32:20.689 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:32:20.689 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:20.690 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:20.690 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:20.690 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:20.690 11:43:04 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:20.948 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:32:21.207 [ 00:32:21.207 { 00:32:21.207 "name": "BaseBdev1", 00:32:21.207 "aliases": [ 00:32:21.207 "52f9816c-2593-45f1-91d7-5cfbd6e531b5" 00:32:21.207 ], 00:32:21.207 "product_name": "Malloc disk", 00:32:21.207 "block_size": 512, 00:32:21.207 "num_blocks": 65536, 00:32:21.207 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:21.207 "assigned_rate_limits": { 00:32:21.207 "rw_ios_per_sec": 0, 00:32:21.207 "rw_mbytes_per_sec": 0, 00:32:21.207 "r_mbytes_per_sec": 0, 00:32:21.207 "w_mbytes_per_sec": 0 00:32:21.207 }, 00:32:21.207 "claimed": true, 00:32:21.207 "claim_type": "exclusive_write", 00:32:21.207 "zoned": false, 00:32:21.207 "supported_io_types": { 00:32:21.207 "read": true, 00:32:21.207 "write": true, 00:32:21.207 "unmap": true, 00:32:21.207 "write_zeroes": true, 00:32:21.207 "flush": true, 00:32:21.207 "reset": true, 00:32:21.207 "compare": false, 00:32:21.207 "compare_and_write": false, 00:32:21.207 "abort": true, 00:32:21.207 "nvme_admin": false, 00:32:21.207 "nvme_io": false 00:32:21.207 }, 00:32:21.207 "memory_domains": [ 00:32:21.207 { 00:32:21.207 "dma_device_id": "system", 00:32:21.207 "dma_device_type": 1 00:32:21.207 }, 00:32:21.207 { 00:32:21.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:21.207 "dma_device_type": 2 00:32:21.207 } 00:32:21.207 ], 00:32:21.207 "driver_specific": {} 00:32:21.207 } 00:32:21.207 ] 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state 
Existed_Raid configuring raid1 0 3 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:21.207 11:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:21.207 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:21.207 "name": "Existed_Raid", 00:32:21.207 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:21.207 "strip_size_kb": 0, 00:32:21.207 "state": "configuring", 00:32:21.207 "raid_level": "raid1", 00:32:21.207 "superblock": true, 00:32:21.207 "num_base_bdevs": 3, 00:32:21.207 "num_base_bdevs_discovered": 2, 00:32:21.207 "num_base_bdevs_operational": 3, 00:32:21.207 "base_bdevs_list": [ 00:32:21.207 { 00:32:21.207 "name": "BaseBdev1", 00:32:21.207 "uuid": 
"52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:21.207 "is_configured": true, 00:32:21.207 "data_offset": 2048, 00:32:21.207 "data_size": 63488 00:32:21.207 }, 00:32:21.207 { 00:32:21.207 "name": null, 00:32:21.207 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:21.207 "is_configured": false, 00:32:21.207 "data_offset": 2048, 00:32:21.207 "data_size": 63488 00:32:21.207 }, 00:32:21.207 { 00:32:21.207 "name": "BaseBdev3", 00:32:21.207 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:21.207 "is_configured": true, 00:32:21.207 "data_offset": 2048, 00:32:21.207 "data_size": 63488 00:32:21.207 } 00:32:21.207 ] 00:32:21.207 }' 00:32:21.207 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:21.207 11:43:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:21.773 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:21.774 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:32:22.031 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:32:22.032 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:32:22.290 [2024-06-10 11:43:05.979751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:22.290 11:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:22.290 11:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:22.290 "name": "Existed_Raid", 00:32:22.290 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:22.290 "strip_size_kb": 0, 00:32:22.290 "state": "configuring", 00:32:22.290 "raid_level": "raid1", 00:32:22.290 "superblock": true, 00:32:22.290 "num_base_bdevs": 3, 00:32:22.290 "num_base_bdevs_discovered": 1, 00:32:22.290 "num_base_bdevs_operational": 3, 00:32:22.290 "base_bdevs_list": [ 00:32:22.290 { 00:32:22.290 "name": "BaseBdev1", 00:32:22.290 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:22.290 "is_configured": true, 00:32:22.290 "data_offset": 2048, 00:32:22.290 "data_size": 63488 00:32:22.290 }, 00:32:22.290 { 00:32:22.290 "name": null, 00:32:22.290 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 
00:32:22.290 "is_configured": false, 00:32:22.290 "data_offset": 2048, 00:32:22.290 "data_size": 63488 00:32:22.290 }, 00:32:22.290 { 00:32:22.290 "name": null, 00:32:22.290 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:22.290 "is_configured": false, 00:32:22.290 "data_offset": 2048, 00:32:22.290 "data_size": 63488 00:32:22.290 } 00:32:22.290 ] 00:32:22.290 }' 00:32:22.290 11:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:22.290 11:43:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:22.856 11:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:32:22.856 11:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:23.115 11:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:32:23.115 11:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:32:23.115 [2024-06-10 11:43:07.002397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:23.115 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:23.372 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:23.372 "name": "Existed_Raid", 00:32:23.372 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:23.372 "strip_size_kb": 0, 00:32:23.372 "state": "configuring", 00:32:23.372 "raid_level": "raid1", 00:32:23.372 "superblock": true, 00:32:23.372 "num_base_bdevs": 3, 00:32:23.372 "num_base_bdevs_discovered": 2, 00:32:23.372 "num_base_bdevs_operational": 3, 00:32:23.372 "base_bdevs_list": [ 00:32:23.372 { 00:32:23.372 "name": "BaseBdev1", 00:32:23.372 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:23.372 "is_configured": true, 00:32:23.372 "data_offset": 2048, 00:32:23.372 "data_size": 63488 00:32:23.372 }, 00:32:23.372 { 00:32:23.372 "name": null, 00:32:23.372 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:23.372 "is_configured": false, 00:32:23.372 "data_offset": 2048, 00:32:23.372 "data_size": 63488 00:32:23.372 }, 00:32:23.372 { 00:32:23.372 "name": "BaseBdev3", 00:32:23.373 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:23.373 
"is_configured": true, 00:32:23.373 "data_offset": 2048, 00:32:23.373 "data_size": 63488 00:32:23.373 } 00:32:23.373 ] 00:32:23.373 }' 00:32:23.373 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:23.373 11:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:23.937 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:23.938 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:32:23.938 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:32:23.938 11:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:32:24.196 [2024-06-10 11:43:08.041100] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:24.196 11:43:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:24.196 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:24.453 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:24.453 "name": "Existed_Raid", 00:32:24.453 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:24.453 "strip_size_kb": 0, 00:32:24.453 "state": "configuring", 00:32:24.453 "raid_level": "raid1", 00:32:24.453 "superblock": true, 00:32:24.453 "num_base_bdevs": 3, 00:32:24.453 "num_base_bdevs_discovered": 1, 00:32:24.453 "num_base_bdevs_operational": 3, 00:32:24.453 "base_bdevs_list": [ 00:32:24.453 { 00:32:24.453 "name": null, 00:32:24.453 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:24.453 "is_configured": false, 00:32:24.453 "data_offset": 2048, 00:32:24.453 "data_size": 63488 00:32:24.453 }, 00:32:24.453 { 00:32:24.453 "name": null, 00:32:24.453 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:24.453 "is_configured": false, 00:32:24.453 "data_offset": 2048, 00:32:24.453 "data_size": 63488 00:32:24.453 }, 00:32:24.453 { 00:32:24.453 "name": "BaseBdev3", 00:32:24.453 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:24.453 "is_configured": true, 00:32:24.453 "data_offset": 2048, 00:32:24.453 "data_size": 63488 00:32:24.453 } 00:32:24.453 ] 00:32:24.453 }' 00:32:24.453 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:24.453 11:43:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:25.019 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:25.019 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:32:25.019 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:32:25.019 11:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:32:25.277 [2024-06-10 11:43:09.085914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:25.277 11:43:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:25.277 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:25.535 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:25.535 "name": "Existed_Raid", 00:32:25.535 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:25.535 "strip_size_kb": 0, 00:32:25.535 "state": "configuring", 00:32:25.535 "raid_level": "raid1", 00:32:25.535 "superblock": true, 00:32:25.535 "num_base_bdevs": 3, 00:32:25.535 "num_base_bdevs_discovered": 2, 00:32:25.535 "num_base_bdevs_operational": 3, 00:32:25.535 "base_bdevs_list": [ 00:32:25.535 { 00:32:25.535 "name": null, 00:32:25.535 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:25.535 "is_configured": false, 00:32:25.535 "data_offset": 2048, 00:32:25.535 "data_size": 63488 00:32:25.535 }, 00:32:25.535 { 00:32:25.535 "name": "BaseBdev2", 00:32:25.535 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:25.535 "is_configured": true, 00:32:25.535 "data_offset": 2048, 00:32:25.535 "data_size": 63488 00:32:25.535 }, 00:32:25.535 { 00:32:25.535 "name": "BaseBdev3", 00:32:25.535 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:25.535 "is_configured": true, 00:32:25.535 "data_offset": 2048, 00:32:25.535 "data_size": 63488 00:32:25.535 } 00:32:25.535 ] 00:32:25.535 }' 00:32:25.535 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:25.535 11:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:26.100 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:26.100 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:32:26.100 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:32:26.100 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:26.100 11:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 52f9816c-2593-45f1-91d7-5cfbd6e531b5 00:32:26.358 [2024-06-10 11:43:10.289028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:32:26.358 [2024-06-10 11:43:10.289175] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17859f0 00:32:26.358 [2024-06-10 11:43:10.289184] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:26.358 [2024-06-10 11:43:10.289304] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1938080 00:32:26.358 [2024-06-10 11:43:10.289389] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17859f0 00:32:26.358 [2024-06-10 11:43:10.289395] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17859f0 00:32:26.358 [2024-06-10 11:43:10.289456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:26.358 NewBaseBdev 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:26.358 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:26.615 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:26.615 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:32:26.873 [ 00:32:26.873 { 00:32:26.873 "name": "NewBaseBdev", 00:32:26.873 "aliases": [ 00:32:26.873 "52f9816c-2593-45f1-91d7-5cfbd6e531b5" 00:32:26.873 ], 00:32:26.873 "product_name": "Malloc disk", 00:32:26.873 "block_size": 512, 00:32:26.873 "num_blocks": 65536, 00:32:26.873 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:26.873 "assigned_rate_limits": { 00:32:26.874 "rw_ios_per_sec": 0, 00:32:26.874 "rw_mbytes_per_sec": 0, 00:32:26.874 "r_mbytes_per_sec": 0, 00:32:26.874 "w_mbytes_per_sec": 0 00:32:26.874 }, 00:32:26.874 "claimed": true, 00:32:26.874 "claim_type": "exclusive_write", 00:32:26.874 "zoned": false, 00:32:26.874 "supported_io_types": { 00:32:26.874 "read": true, 00:32:26.874 "write": true, 00:32:26.874 "unmap": true, 00:32:26.874 "write_zeroes": true, 00:32:26.874 "flush": true, 00:32:26.874 "reset": true, 00:32:26.874 "compare": false, 00:32:26.874 "compare_and_write": false, 00:32:26.874 "abort": true, 00:32:26.874 "nvme_admin": false, 00:32:26.874 "nvme_io": false 00:32:26.874 }, 00:32:26.874 
"memory_domains": [ 00:32:26.874 { 00:32:26.874 "dma_device_id": "system", 00:32:26.874 "dma_device_type": 1 00:32:26.874 }, 00:32:26.874 { 00:32:26.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:26.874 "dma_device_type": 2 00:32:26.874 } 00:32:26.874 ], 00:32:26.874 "driver_specific": {} 00:32:26.874 } 00:32:26.874 ] 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:26.874 11:43:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:26.874 "name": "Existed_Raid", 00:32:26.874 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:26.874 "strip_size_kb": 0, 00:32:26.874 "state": "online", 00:32:26.874 "raid_level": "raid1", 00:32:26.874 "superblock": true, 00:32:26.874 "num_base_bdevs": 3, 00:32:26.874 "num_base_bdevs_discovered": 3, 00:32:26.874 "num_base_bdevs_operational": 3, 00:32:26.874 "base_bdevs_list": [ 00:32:26.874 { 00:32:26.874 "name": "NewBaseBdev", 00:32:26.874 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:26.874 "is_configured": true, 00:32:26.874 "data_offset": 2048, 00:32:26.874 "data_size": 63488 00:32:26.874 }, 00:32:26.874 { 00:32:26.874 "name": "BaseBdev2", 00:32:26.874 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:26.874 "is_configured": true, 00:32:26.874 "data_offset": 2048, 00:32:26.874 "data_size": 63488 00:32:26.874 }, 00:32:26.874 { 00:32:26.874 "name": "BaseBdev3", 00:32:26.874 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:26.874 "is_configured": true, 00:32:26.874 "data_offset": 2048, 00:32:26.874 "data_size": 63488 00:32:26.874 } 00:32:26.874 ] 00:32:26.874 }' 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:26.874 11:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:32:27.440 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:27.696 [2024-06-10 11:43:11.484338] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:27.696 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:27.696 "name": "Existed_Raid", 00:32:27.696 "aliases": [ 00:32:27.696 "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad" 00:32:27.696 ], 00:32:27.697 "product_name": "Raid Volume", 00:32:27.697 "block_size": 512, 00:32:27.697 "num_blocks": 63488, 00:32:27.697 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:27.697 "assigned_rate_limits": { 00:32:27.697 "rw_ios_per_sec": 0, 00:32:27.697 "rw_mbytes_per_sec": 0, 00:32:27.697 "r_mbytes_per_sec": 0, 00:32:27.697 "w_mbytes_per_sec": 0 00:32:27.697 }, 00:32:27.697 "claimed": false, 00:32:27.697 "zoned": false, 00:32:27.697 "supported_io_types": { 00:32:27.697 "read": true, 00:32:27.697 "write": true, 00:32:27.697 "unmap": false, 00:32:27.697 "write_zeroes": true, 00:32:27.697 "flush": false, 00:32:27.697 "reset": true, 00:32:27.697 "compare": false, 00:32:27.697 "compare_and_write": false, 00:32:27.697 "abort": false, 00:32:27.697 "nvme_admin": false, 00:32:27.697 "nvme_io": false 00:32:27.697 }, 00:32:27.697 "memory_domains": [ 00:32:27.697 { 00:32:27.697 "dma_device_id": "system", 00:32:27.697 "dma_device_type": 1 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:27.697 "dma_device_type": 2 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 "dma_device_id": "system", 00:32:27.697 "dma_device_type": 1 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:27.697 "dma_device_type": 2 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 "dma_device_id": "system", 00:32:27.697 "dma_device_type": 1 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:27.697 "dma_device_type": 2 00:32:27.697 } 00:32:27.697 ], 00:32:27.697 "driver_specific": { 00:32:27.697 "raid": { 00:32:27.697 "uuid": "2ad2a759-b7ea-4c2f-9dc1-21e7b82b38ad", 00:32:27.697 "strip_size_kb": 0, 00:32:27.697 "state": "online", 00:32:27.697 "raid_level": "raid1", 00:32:27.697 "superblock": true, 00:32:27.697 "num_base_bdevs": 3, 00:32:27.697 "num_base_bdevs_discovered": 3, 00:32:27.697 "num_base_bdevs_operational": 3, 00:32:27.697 "base_bdevs_list": [ 00:32:27.697 { 00:32:27.697 "name": "NewBaseBdev", 00:32:27.697 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:27.697 "is_configured": true, 00:32:27.697 "data_offset": 2048, 00:32:27.697 "data_size": 63488 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 "name": "BaseBdev2", 00:32:27.697 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:27.697 "is_configured": true, 00:32:27.697 "data_offset": 2048, 00:32:27.697 "data_size": 63488 00:32:27.697 }, 00:32:27.697 { 00:32:27.697 "name": "BaseBdev3", 00:32:27.697 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:27.697 "is_configured": true, 00:32:27.697 "data_offset": 2048, 00:32:27.697 "data_size": 63488 00:32:27.697 } 00:32:27.697 ] 00:32:27.697 } 00:32:27.697 } 00:32:27.697 }' 00:32:27.697 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:27.697 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:32:27.697 BaseBdev2 00:32:27.697 BaseBdev3' 00:32:27.697 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:27.697 11:43:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:32:27.697 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:27.954 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:27.954 "name": "NewBaseBdev", 00:32:27.954 "aliases": [ 00:32:27.954 "52f9816c-2593-45f1-91d7-5cfbd6e531b5" 00:32:27.954 ], 00:32:27.954 "product_name": "Malloc disk", 00:32:27.954 "block_size": 512, 00:32:27.954 "num_blocks": 65536, 00:32:27.954 "uuid": "52f9816c-2593-45f1-91d7-5cfbd6e531b5", 00:32:27.954 "assigned_rate_limits": { 00:32:27.954 "rw_ios_per_sec": 0, 00:32:27.954 "rw_mbytes_per_sec": 0, 00:32:27.954 "r_mbytes_per_sec": 0, 00:32:27.954 "w_mbytes_per_sec": 0 00:32:27.955 }, 00:32:27.955 "claimed": true, 00:32:27.955 "claim_type": "exclusive_write", 00:32:27.955 "zoned": false, 00:32:27.955 "supported_io_types": { 00:32:27.955 "read": true, 00:32:27.955 "write": true, 00:32:27.955 "unmap": true, 00:32:27.955 "write_zeroes": true, 00:32:27.955 "flush": true, 00:32:27.955 "reset": true, 00:32:27.955 "compare": false, 00:32:27.955 "compare_and_write": false, 00:32:27.955 "abort": true, 00:32:27.955 "nvme_admin": false, 00:32:27.955 "nvme_io": false 00:32:27.955 }, 00:32:27.955 "memory_domains": [ 00:32:27.955 { 00:32:27.955 "dma_device_id": "system", 00:32:27.955 "dma_device_type": 1 00:32:27.955 }, 00:32:27.955 { 00:32:27.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:27.955 "dma_device_type": 2 00:32:27.955 } 00:32:27.955 ], 00:32:27.955 "driver_specific": {} 00:32:27.955 }' 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:27.955 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:28.212 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:28.212 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:28.212 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:28.212 11:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:28.212 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:28.212 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:28.212 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:32:28.212 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:28.469 "name": "BaseBdev2", 00:32:28.469 "aliases": [ 00:32:28.469 "3097979e-858e-4ebb-ab2a-5a9ee0adab8c" 00:32:28.469 ], 00:32:28.469 "product_name": "Malloc disk", 00:32:28.469 "block_size": 512, 00:32:28.469 "num_blocks": 65536, 00:32:28.469 "uuid": "3097979e-858e-4ebb-ab2a-5a9ee0adab8c", 00:32:28.469 "assigned_rate_limits": { 00:32:28.469 "rw_ios_per_sec": 0, 00:32:28.469 "rw_mbytes_per_sec": 0, 00:32:28.469 "r_mbytes_per_sec": 0, 00:32:28.469 
"w_mbytes_per_sec": 0 00:32:28.469 }, 00:32:28.469 "claimed": true, 00:32:28.469 "claim_type": "exclusive_write", 00:32:28.469 "zoned": false, 00:32:28.469 "supported_io_types": { 00:32:28.469 "read": true, 00:32:28.469 "write": true, 00:32:28.469 "unmap": true, 00:32:28.469 "write_zeroes": true, 00:32:28.469 "flush": true, 00:32:28.469 "reset": true, 00:32:28.469 "compare": false, 00:32:28.469 "compare_and_write": false, 00:32:28.469 "abort": true, 00:32:28.469 "nvme_admin": false, 00:32:28.469 "nvme_io": false 00:32:28.469 }, 00:32:28.469 "memory_domains": [ 00:32:28.469 { 00:32:28.469 "dma_device_id": "system", 00:32:28.469 "dma_device_type": 1 00:32:28.469 }, 00:32:28.469 { 00:32:28.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:28.469 "dma_device_type": 2 00:32:28.469 } 00:32:28.469 ], 00:32:28.469 "driver_specific": {} 00:32:28.469 }' 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:28.469 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:28.726 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:28.726 "name": "BaseBdev3", 00:32:28.726 "aliases": [ 00:32:28.726 "52df319f-69a0-45cd-8b7e-e19a081976f5" 00:32:28.726 ], 00:32:28.726 "product_name": "Malloc disk", 00:32:28.726 "block_size": 512, 00:32:28.726 "num_blocks": 65536, 00:32:28.726 "uuid": "52df319f-69a0-45cd-8b7e-e19a081976f5", 00:32:28.726 "assigned_rate_limits": { 00:32:28.726 "rw_ios_per_sec": 0, 00:32:28.726 "rw_mbytes_per_sec": 0, 00:32:28.726 "r_mbytes_per_sec": 0, 00:32:28.726 "w_mbytes_per_sec": 0 00:32:28.726 }, 00:32:28.726 "claimed": true, 00:32:28.726 "claim_type": "exclusive_write", 00:32:28.726 "zoned": false, 00:32:28.726 "supported_io_types": { 00:32:28.726 "read": true, 00:32:28.726 "write": true, 00:32:28.726 "unmap": true, 00:32:28.726 "write_zeroes": true, 00:32:28.726 "flush": true, 00:32:28.726 "reset": true, 00:32:28.726 "compare": false, 00:32:28.726 "compare_and_write": false, 00:32:28.726 "abort": true, 00:32:28.726 "nvme_admin": false, 00:32:28.726 "nvme_io": false 00:32:28.726 }, 00:32:28.726 "memory_domains": [ 00:32:28.726 { 00:32:28.726 "dma_device_id": "system", 00:32:28.726 "dma_device_type": 1 00:32:28.726 }, 00:32:28.726 { 00:32:28.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:28.726 "dma_device_type": 2 00:32:28.726 } 00:32:28.726 ], 00:32:28.726 "driver_specific": {} 00:32:28.726 }' 
00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:28.984 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:29.241 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:29.241 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:29.241 11:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:29.241 [2024-06-10 11:43:13.132467] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:29.241 [2024-06-10 11:43:13.132486] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:29.241 [2024-06-10 11:43:13.132521] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:29.241 [2024-06-10 11:43:13.132703] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:32:29.241 [2024-06-10 11:43:13.132712] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17859f0 name Existed_Raid, state offline 00:32:29.241 11:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 171894 00:32:29.241 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 171894 ']' 00:32:29.242 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 171894 00:32:29.242 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:32:29.242 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:29.242 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 171894 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 171894' 00:32:29.499 killing process with pid 171894 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 171894 00:32:29.499 [2024-06-10 11:43:13.198536] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 171894 00:32:29.499 [2024-06-10 11:43:13.224225] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:32:29.499 00:32:29.499 real 0m22.042s 00:32:29.499 user 0m40.202s 00:32:29.499 sys 0m4.272s 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # 
xtrace_disable 00:32:29.499 11:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:29.499 ************************************ 00:32:29.499 END TEST raid_state_function_test_sb 00:32:29.499 ************************************ 00:32:29.757 11:43:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:32:29.757 11:43:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:32:29.757 11:43:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:29.757 11:43:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:29.757 ************************************ 00:32:29.757 START TEST raid_superblock_test 00:32:29.757 ************************************ 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 3 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:32:29.757 11:43:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:32:29.757 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=175461 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 175461 /var/tmp/spdk-raid.sock 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 175461 ']' 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:29.758 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:29.758 11:43:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:29.758 [2024-06-10 11:43:13.566150] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:32:29.758 [2024-06-10 11:43:13.566207] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid175461 ] 00:32:29.758 [2024-06-10 11:43:13.653875] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:30.016 [2024-06-10 11:43:13.733241] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:30.016 [2024-06-10 11:43:13.789475] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:30.016 [2024-06-10 11:43:13.789507] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:32:30.582 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:32:30.582 malloc1 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:32:30.839 [2024-06-10 11:43:14.684892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:32:30.839 [2024-06-10 11:43:14.684932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:30.839 [2024-06-10 11:43:14.684949] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b3a100 00:32:30.839 [2024-06-10 11:43:14.684958] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:30.839 [2024-06-10 11:43:14.686208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:30.839 [2024-06-10 11:43:14.686230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:32:30.839 pt1 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:32:30.839 11:43:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:32:31.097 malloc2 00:32:31.097 11:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:31.354 [2024-06-10 11:43:15.049708] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:31.354 [2024-06-10 11:43:15.049745] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:31.354 [2024-06-10 11:43:15.049759] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b3b500 00:32:31.354 [2024-06-10 11:43:15.049767] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:31.354 [2024-06-10 11:43:15.050814] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:31.354 [2024-06-10 11:43:15.050836] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:31.354 pt2 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:32:31.354 malloc3 00:32:31.354 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:32:31.612 [2024-06-10 11:43:15.398223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:32:31.612 [2024-06-10 11:43:15.398261] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:31.612 [2024-06-10 11:43:15.398273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce57a0 00:32:31.612 [2024-06-10 11:43:15.398282] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:31.612 [2024-06-10 11:43:15.399346] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:31.612 [2024-06-10 11:43:15.399367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:32:31.612 pt3 00:32:31.612 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:32:31.612 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:31.612 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:32:31.869 [2024-06-10 11:43:15.562665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:32:31.869 [2024-06-10 11:43:15.563553] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:31.869 [2024-06-10 11:43:15.563591] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:32:31.869 [2024-06-10 11:43:15.563702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce8d40 00:32:31.869 [2024-06-10 11:43:15.563709] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:31.869 [2024-06-10 11:43:15.563844] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b3ae90 00:32:31.869 [2024-06-10 11:43:15.563954] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce8d40 00:32:31.869 [2024-06-10 11:43:15.563961] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ce8d40 00:32:31.869 [2024-06-10 11:43:15.564027] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:31.869 "name": "raid_bdev1", 00:32:31.869 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:31.869 "strip_size_kb": 0, 00:32:31.869 "state": "online", 00:32:31.869 "raid_level": "raid1", 00:32:31.869 "superblock": true, 00:32:31.869 "num_base_bdevs": 3, 00:32:31.869 "num_base_bdevs_discovered": 3, 00:32:31.869 "num_base_bdevs_operational": 3, 00:32:31.869 "base_bdevs_list": [ 00:32:31.869 { 00:32:31.869 "name": "pt1", 00:32:31.869 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:31.869 "is_configured": true, 00:32:31.869 "data_offset": 2048, 00:32:31.869 "data_size": 63488 00:32:31.869 }, 00:32:31.869 { 00:32:31.869 "name": "pt2", 00:32:31.869 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:31.869 "is_configured": true, 00:32:31.869 "data_offset": 2048, 00:32:31.869 "data_size": 63488 00:32:31.869 }, 00:32:31.869 { 00:32:31.869 "name": "pt3", 00:32:31.869 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:31.869 "is_configured": true, 00:32:31.869 "data_offset": 2048, 00:32:31.869 "data_size": 63488 00:32:31.869 } 00:32:31.869 ] 00:32:31.869 }' 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:31.869 11:43:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:32.433 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:32.691 [2024-06-10 11:43:16.388935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:32.691 "name": "raid_bdev1", 00:32:32.691 "aliases": [ 00:32:32.691 "0f0ba238-31cf-4f98-ab85-315eb39c5e13" 00:32:32.691 ], 00:32:32.691 "product_name": "Raid Volume", 00:32:32.691 "block_size": 512, 00:32:32.691 "num_blocks": 63488, 00:32:32.691 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:32.691 "assigned_rate_limits": { 00:32:32.691 "rw_ios_per_sec": 0, 00:32:32.691 "rw_mbytes_per_sec": 0, 00:32:32.691 "r_mbytes_per_sec": 0, 00:32:32.691 "w_mbytes_per_sec": 0 00:32:32.691 }, 00:32:32.691 "claimed": false, 00:32:32.691 "zoned": false, 00:32:32.691 "supported_io_types": { 00:32:32.691 "read": true, 00:32:32.691 "write": true, 00:32:32.691 "unmap": false, 00:32:32.691 "write_zeroes": true, 00:32:32.691 "flush": false, 00:32:32.691 "reset": true, 00:32:32.691 "compare": false, 00:32:32.691 "compare_and_write": false, 00:32:32.691 "abort": false, 00:32:32.691 "nvme_admin": false, 00:32:32.691 "nvme_io": false 00:32:32.691 }, 00:32:32.691 "memory_domains": [ 00:32:32.691 { 00:32:32.691 "dma_device_id": "system", 00:32:32.691 "dma_device_type": 1 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:32.691 
"dma_device_type": 2 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "dma_device_id": "system", 00:32:32.691 "dma_device_type": 1 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:32.691 "dma_device_type": 2 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "dma_device_id": "system", 00:32:32.691 "dma_device_type": 1 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:32.691 "dma_device_type": 2 00:32:32.691 } 00:32:32.691 ], 00:32:32.691 "driver_specific": { 00:32:32.691 "raid": { 00:32:32.691 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:32.691 "strip_size_kb": 0, 00:32:32.691 "state": "online", 00:32:32.691 "raid_level": "raid1", 00:32:32.691 "superblock": true, 00:32:32.691 "num_base_bdevs": 3, 00:32:32.691 "num_base_bdevs_discovered": 3, 00:32:32.691 "num_base_bdevs_operational": 3, 00:32:32.691 "base_bdevs_list": [ 00:32:32.691 { 00:32:32.691 "name": "pt1", 00:32:32.691 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:32.691 "is_configured": true, 00:32:32.691 "data_offset": 2048, 00:32:32.691 "data_size": 63488 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "name": "pt2", 00:32:32.691 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:32.691 "is_configured": true, 00:32:32.691 "data_offset": 2048, 00:32:32.691 "data_size": 63488 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "name": "pt3", 00:32:32.691 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:32.691 "is_configured": true, 00:32:32.691 "data_offset": 2048, 00:32:32.691 "data_size": 63488 00:32:32.691 } 00:32:32.691 ] 00:32:32.691 } 00:32:32.691 } 00:32:32.691 }' 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:32:32.691 pt2 00:32:32.691 pt3' 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:32.691 "name": "pt1", 00:32:32.691 "aliases": [ 00:32:32.691 "00000000-0000-0000-0000-000000000001" 00:32:32.691 ], 00:32:32.691 "product_name": "passthru", 00:32:32.691 "block_size": 512, 00:32:32.691 "num_blocks": 65536, 00:32:32.691 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:32.691 "assigned_rate_limits": { 00:32:32.691 "rw_ios_per_sec": 0, 00:32:32.691 "rw_mbytes_per_sec": 0, 00:32:32.691 "r_mbytes_per_sec": 0, 00:32:32.691 "w_mbytes_per_sec": 0 00:32:32.691 }, 00:32:32.691 "claimed": true, 00:32:32.691 "claim_type": "exclusive_write", 00:32:32.691 "zoned": false, 00:32:32.691 "supported_io_types": { 00:32:32.691 "read": true, 00:32:32.691 "write": true, 00:32:32.691 "unmap": true, 00:32:32.691 "write_zeroes": true, 00:32:32.691 "flush": true, 00:32:32.691 "reset": true, 00:32:32.691 "compare": false, 00:32:32.691 "compare_and_write": false, 00:32:32.691 "abort": true, 00:32:32.691 "nvme_admin": false, 00:32:32.691 "nvme_io": false 00:32:32.691 }, 00:32:32.691 "memory_domains": [ 00:32:32.691 { 00:32:32.691 "dma_device_id": "system", 00:32:32.691 "dma_device_type": 1 00:32:32.691 }, 00:32:32.691 { 00:32:32.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:32.691 "dma_device_type": 2 00:32:32.691 } 00:32:32.691 ], 00:32:32.691 "driver_specific": { 00:32:32.691 "passthru": { 00:32:32.691 "name": "pt1", 00:32:32.691 "base_bdev_name": "malloc1" 00:32:32.691 } 00:32:32.691 } 00:32:32.691 }' 00:32:32.691 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:32.948 11:43:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:32.948 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:33.205 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:33.205 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:33.205 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:33.205 11:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:32:33.205 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:33.205 "name": "pt2", 00:32:33.205 "aliases": [ 00:32:33.205 "00000000-0000-0000-0000-000000000002" 00:32:33.205 ], 00:32:33.205 "product_name": "passthru", 00:32:33.205 "block_size": 512, 00:32:33.205 "num_blocks": 65536, 00:32:33.205 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:33.205 "assigned_rate_limits": { 00:32:33.205 "rw_ios_per_sec": 0, 00:32:33.205 "rw_mbytes_per_sec": 0, 00:32:33.205 "r_mbytes_per_sec": 
0, 00:32:33.205 "w_mbytes_per_sec": 0 00:32:33.205 }, 00:32:33.205 "claimed": true, 00:32:33.205 "claim_type": "exclusive_write", 00:32:33.205 "zoned": false, 00:32:33.205 "supported_io_types": { 00:32:33.205 "read": true, 00:32:33.205 "write": true, 00:32:33.205 "unmap": true, 00:32:33.205 "write_zeroes": true, 00:32:33.205 "flush": true, 00:32:33.205 "reset": true, 00:32:33.205 "compare": false, 00:32:33.205 "compare_and_write": false, 00:32:33.205 "abort": true, 00:32:33.205 "nvme_admin": false, 00:32:33.205 "nvme_io": false 00:32:33.205 }, 00:32:33.205 "memory_domains": [ 00:32:33.205 { 00:32:33.205 "dma_device_id": "system", 00:32:33.205 "dma_device_type": 1 00:32:33.205 }, 00:32:33.205 { 00:32:33.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:33.205 "dma_device_type": 2 00:32:33.205 } 00:32:33.205 ], 00:32:33.205 "driver_specific": { 00:32:33.205 "passthru": { 00:32:33.205 "name": "pt2", 00:32:33.205 "base_bdev_name": "malloc2" 00:32:33.205 } 00:32:33.205 } 00:32:33.205 }' 00:32:33.205 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:33.205 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:33.462 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:33.462 
11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:33.719 "name": "pt3", 00:32:33.719 "aliases": [ 00:32:33.719 "00000000-0000-0000-0000-000000000003" 00:32:33.719 ], 00:32:33.719 "product_name": "passthru", 00:32:33.719 "block_size": 512, 00:32:33.719 "num_blocks": 65536, 00:32:33.719 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:33.719 "assigned_rate_limits": { 00:32:33.719 "rw_ios_per_sec": 0, 00:32:33.719 "rw_mbytes_per_sec": 0, 00:32:33.719 "r_mbytes_per_sec": 0, 00:32:33.719 "w_mbytes_per_sec": 0 00:32:33.719 }, 00:32:33.719 "claimed": true, 00:32:33.719 "claim_type": "exclusive_write", 00:32:33.719 "zoned": false, 00:32:33.719 "supported_io_types": { 00:32:33.719 "read": true, 00:32:33.719 "write": true, 00:32:33.719 "unmap": true, 00:32:33.719 "write_zeroes": true, 00:32:33.719 "flush": true, 00:32:33.719 "reset": true, 00:32:33.719 "compare": false, 00:32:33.719 "compare_and_write": false, 00:32:33.719 "abort": true, 00:32:33.719 "nvme_admin": false, 00:32:33.719 "nvme_io": false 00:32:33.719 }, 00:32:33.719 "memory_domains": [ 00:32:33.719 { 00:32:33.719 "dma_device_id": "system", 00:32:33.719 "dma_device_type": 1 00:32:33.719 }, 00:32:33.719 { 00:32:33.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:33.719 "dma_device_type": 2 00:32:33.719 } 00:32:33.719 ], 00:32:33.719 "driver_specific": { 00:32:33.719 "passthru": 
{ 00:32:33.719 "name": "pt3", 00:32:33.719 "base_bdev_name": "malloc3" 00:32:33.719 } 00:32:33.719 } 00:32:33.719 }' 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:33.719 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:33.977 11:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:32:34.234 [2024-06-10 11:43:18.057265] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:34.234 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0f0ba238-31cf-4f98-ab85-315eb39c5e13 00:32:34.234 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0f0ba238-31cf-4f98-ab85-315eb39c5e13 ']' 
00:32:34.234 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:34.491 [2024-06-10 11:43:18.225548] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:34.491 [2024-06-10 11:43:18.225568] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:34.491 [2024-06-10 11:43:18.225606] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:34.491 [2024-06-10 11:43:18.225655] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:34.491 [2024-06-10 11:43:18.225664] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce8d40 name raid_bdev1, state offline 00:32:34.491 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:34.491 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:32:34.491 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:32:34.491 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:32:34.491 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:32:34.491 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:32:34.749 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:32:34.749 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:32:35.006 11:43:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:32:35.006 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:32:35.006 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:32:35.006 11:43:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:32:35.264 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:32:35.522 [2024-06-10 11:43:19.252180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:32:35.522 [2024-06-10 11:43:19.253228] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:32:35.522 [2024-06-10 11:43:19.253259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:32:35.522 [2024-06-10 11:43:19.253294] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:32:35.522 [2024-06-10 11:43:19.253323] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:32:35.522 [2024-06-10 11:43:19.253354] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:32:35.522 [2024-06-10 11:43:19.253366] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:35.522 [2024-06-10 11:43:19.253374] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce6ad0 name raid_bdev1, state configuring 00:32:35.522 request: 00:32:35.522 { 00:32:35.522 "name": "raid_bdev1", 00:32:35.522 "raid_level": 
"raid1", 00:32:35.522 "base_bdevs": [ 00:32:35.522 "malloc1", 00:32:35.522 "malloc2", 00:32:35.522 "malloc3" 00:32:35.522 ], 00:32:35.522 "superblock": false, 00:32:35.522 "method": "bdev_raid_create", 00:32:35.522 "req_id": 1 00:32:35.522 } 00:32:35.522 Got JSON-RPC error response 00:32:35.522 response: 00:32:35.522 { 00:32:35.522 "code": -17, 00:32:35.522 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:32:35.522 } 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:32:35.522 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:32:35.780 [2024-06-10 11:43:19.597217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:32:35.780 [2024-06-10 11:43:19.597263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:35.780 [2024-06-10 11:43:19.597293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce7b50 00:32:35.780 [2024-06-10 11:43:19.597302] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:35.780 [2024-06-10 11:43:19.598601] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:35.780 [2024-06-10 11:43:19.598625] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:32:35.780 [2024-06-10 11:43:19.598682] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:32:35.780 [2024-06-10 11:43:19.598704] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:32:35.780 pt1 00:32:35.780 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:32:35.780 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:35.780 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:35.780 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:35.780 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:35.780 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:35.781 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:35.781 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:35.781 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:35.781 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:35.781 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:35.781 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:32:36.038 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:36.038 "name": "raid_bdev1", 00:32:36.038 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:36.038 "strip_size_kb": 0, 00:32:36.038 "state": "configuring", 00:32:36.038 "raid_level": "raid1", 00:32:36.038 "superblock": true, 00:32:36.038 "num_base_bdevs": 3, 00:32:36.038 "num_base_bdevs_discovered": 1, 00:32:36.038 "num_base_bdevs_operational": 3, 00:32:36.038 "base_bdevs_list": [ 00:32:36.038 { 00:32:36.038 "name": "pt1", 00:32:36.038 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:36.038 "is_configured": true, 00:32:36.038 "data_offset": 2048, 00:32:36.038 "data_size": 63488 00:32:36.038 }, 00:32:36.038 { 00:32:36.038 "name": null, 00:32:36.038 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:36.038 "is_configured": false, 00:32:36.038 "data_offset": 2048, 00:32:36.038 "data_size": 63488 00:32:36.038 }, 00:32:36.038 { 00:32:36.038 "name": null, 00:32:36.038 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:36.038 "is_configured": false, 00:32:36.038 "data_offset": 2048, 00:32:36.038 "data_size": 63488 00:32:36.038 } 00:32:36.038 ] 00:32:36.038 }' 00:32:36.038 11:43:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:36.038 11:43:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:36.604 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:32:36.604 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:36.604 [2024-06-10 11:43:20.447436] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:36.604 [2024-06-10 11:43:20.447486] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:32:36.604 [2024-06-10 11:43:20.447502] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce9900 00:32:36.604 [2024-06-10 11:43:20.447511] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:36.604 [2024-06-10 11:43:20.447782] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:36.604 [2024-06-10 11:43:20.447793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:36.604 [2024-06-10 11:43:20.447846] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:32:36.604 [2024-06-10 11:43:20.447861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:36.604 pt2 00:32:36.604 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:32:36.862 [2024-06-10 11:43:20.623897] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:36.862 11:43:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:36.862 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:37.120 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:37.120 "name": "raid_bdev1", 00:32:37.120 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:37.120 "strip_size_kb": 0, 00:32:37.120 "state": "configuring", 00:32:37.120 "raid_level": "raid1", 00:32:37.120 "superblock": true, 00:32:37.120 "num_base_bdevs": 3, 00:32:37.120 "num_base_bdevs_discovered": 1, 00:32:37.120 "num_base_bdevs_operational": 3, 00:32:37.120 "base_bdevs_list": [ 00:32:37.120 { 00:32:37.120 "name": "pt1", 00:32:37.120 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:37.120 "is_configured": true, 00:32:37.120 "data_offset": 2048, 00:32:37.120 "data_size": 63488 00:32:37.120 }, 00:32:37.120 { 00:32:37.120 "name": null, 00:32:37.120 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:37.120 "is_configured": false, 00:32:37.120 "data_offset": 2048, 00:32:37.120 "data_size": 63488 00:32:37.120 }, 00:32:37.120 { 00:32:37.120 "name": null, 00:32:37.120 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:37.120 "is_configured": false, 00:32:37.120 "data_offset": 2048, 00:32:37.120 "data_size": 63488 00:32:37.120 } 00:32:37.120 ] 00:32:37.120 }' 00:32:37.120 11:43:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:37.120 11:43:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:37.378 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 
00:32:37.378 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:32:37.378 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:37.662 [2024-06-10 11:43:21.478100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:37.662 [2024-06-10 11:43:21.478142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:37.662 [2024-06-10 11:43:21.478157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b3a330 00:32:37.662 [2024-06-10 11:43:21.478165] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:37.662 [2024-06-10 11:43:21.478431] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:37.662 [2024-06-10 11:43:21.478443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:37.662 [2024-06-10 11:43:21.478493] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:32:37.662 [2024-06-10 11:43:21.478507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:37.662 pt2 00:32:37.662 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:32:37.662 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:32:37.662 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:32:37.951 [2024-06-10 11:43:21.662576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:32:37.951 [2024-06-10 11:43:21.662610] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:32:37.951 [2024-06-10 11:43:21.662622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ceaef0 00:32:37.951 [2024-06-10 11:43:21.662631] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:37.951 [2024-06-10 11:43:21.662873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:37.951 [2024-06-10 11:43:21.662885] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:32:37.951 [2024-06-10 11:43:21.662930] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:32:37.951 [2024-06-10 11:43:21.662943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:32:37.951 [2024-06-10 11:43:21.663040] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce4a60 00:32:37.951 [2024-06-10 11:43:21.663047] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:37.951 [2024-06-10 11:43:21.663163] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b302e0 00:32:37.951 [2024-06-10 11:43:21.663253] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce4a60 00:32:37.951 [2024-06-10 11:43:21.663260] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ce4a60 00:32:37.951 [2024-06-10 11:43:21.663326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:37.951 pt3 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:37.951 
11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:37.951 "name": "raid_bdev1", 00:32:37.951 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:37.951 "strip_size_kb": 0, 00:32:37.951 "state": "online", 00:32:37.951 "raid_level": "raid1", 00:32:37.951 "superblock": true, 00:32:37.951 "num_base_bdevs": 3, 00:32:37.951 "num_base_bdevs_discovered": 3, 00:32:37.951 "num_base_bdevs_operational": 3, 00:32:37.951 "base_bdevs_list": [ 00:32:37.951 { 00:32:37.951 "name": "pt1", 00:32:37.951 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:37.951 "is_configured": true, 00:32:37.951 "data_offset": 2048, 00:32:37.951 "data_size": 63488 00:32:37.951 }, 00:32:37.951 { 00:32:37.951 "name": "pt2", 00:32:37.951 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:37.951 
"is_configured": true, 00:32:37.951 "data_offset": 2048, 00:32:37.951 "data_size": 63488 00:32:37.951 }, 00:32:37.951 { 00:32:37.951 "name": "pt3", 00:32:37.951 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:37.951 "is_configured": true, 00:32:37.951 "data_offset": 2048, 00:32:37.951 "data_size": 63488 00:32:37.951 } 00:32:37.951 ] 00:32:37.951 }' 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:37.951 11:43:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:38.517 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:38.776 [2024-06-10 11:43:22.512927] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:38.776 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:38.776 "name": "raid_bdev1", 00:32:38.776 "aliases": [ 00:32:38.776 "0f0ba238-31cf-4f98-ab85-315eb39c5e13" 00:32:38.776 ], 00:32:38.776 "product_name": "Raid Volume", 00:32:38.776 "block_size": 512, 00:32:38.776 "num_blocks": 63488, 00:32:38.776 "uuid": 
"0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:38.776 "assigned_rate_limits": { 00:32:38.776 "rw_ios_per_sec": 0, 00:32:38.776 "rw_mbytes_per_sec": 0, 00:32:38.776 "r_mbytes_per_sec": 0, 00:32:38.776 "w_mbytes_per_sec": 0 00:32:38.776 }, 00:32:38.776 "claimed": false, 00:32:38.776 "zoned": false, 00:32:38.776 "supported_io_types": { 00:32:38.776 "read": true, 00:32:38.776 "write": true, 00:32:38.776 "unmap": false, 00:32:38.776 "write_zeroes": true, 00:32:38.776 "flush": false, 00:32:38.776 "reset": true, 00:32:38.776 "compare": false, 00:32:38.776 "compare_and_write": false, 00:32:38.776 "abort": false, 00:32:38.776 "nvme_admin": false, 00:32:38.776 "nvme_io": false 00:32:38.776 }, 00:32:38.776 "memory_domains": [ 00:32:38.776 { 00:32:38.776 "dma_device_id": "system", 00:32:38.776 "dma_device_type": 1 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:38.776 "dma_device_type": 2 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "dma_device_id": "system", 00:32:38.776 "dma_device_type": 1 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:38.776 "dma_device_type": 2 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "dma_device_id": "system", 00:32:38.776 "dma_device_type": 1 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:38.776 "dma_device_type": 2 00:32:38.776 } 00:32:38.776 ], 00:32:38.776 "driver_specific": { 00:32:38.776 "raid": { 00:32:38.776 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:38.776 "strip_size_kb": 0, 00:32:38.776 "state": "online", 00:32:38.776 "raid_level": "raid1", 00:32:38.776 "superblock": true, 00:32:38.776 "num_base_bdevs": 3, 00:32:38.776 "num_base_bdevs_discovered": 3, 00:32:38.776 "num_base_bdevs_operational": 3, 00:32:38.776 "base_bdevs_list": [ 00:32:38.776 { 00:32:38.776 "name": "pt1", 00:32:38.776 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:38.776 "is_configured": true, 00:32:38.776 "data_offset": 2048, 
00:32:38.776 "data_size": 63488 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "name": "pt2", 00:32:38.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:38.776 "is_configured": true, 00:32:38.776 "data_offset": 2048, 00:32:38.776 "data_size": 63488 00:32:38.776 }, 00:32:38.776 { 00:32:38.776 "name": "pt3", 00:32:38.776 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:38.776 "is_configured": true, 00:32:38.776 "data_offset": 2048, 00:32:38.776 "data_size": 63488 00:32:38.776 } 00:32:38.776 ] 00:32:38.776 } 00:32:38.776 } 00:32:38.776 }' 00:32:38.776 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:38.776 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:32:38.776 pt2 00:32:38.776 pt3' 00:32:38.776 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:38.776 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:38.776 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:39.034 "name": "pt1", 00:32:39.034 "aliases": [ 00:32:39.034 "00000000-0000-0000-0000-000000000001" 00:32:39.034 ], 00:32:39.034 "product_name": "passthru", 00:32:39.034 "block_size": 512, 00:32:39.034 "num_blocks": 65536, 00:32:39.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:39.034 "assigned_rate_limits": { 00:32:39.034 "rw_ios_per_sec": 0, 00:32:39.034 "rw_mbytes_per_sec": 0, 00:32:39.034 "r_mbytes_per_sec": 0, 00:32:39.034 "w_mbytes_per_sec": 0 00:32:39.034 }, 00:32:39.034 "claimed": true, 00:32:39.034 "claim_type": "exclusive_write", 00:32:39.034 "zoned": false, 00:32:39.034 "supported_io_types": { 
00:32:39.034 "read": true, 00:32:39.034 "write": true, 00:32:39.034 "unmap": true, 00:32:39.034 "write_zeroes": true, 00:32:39.034 "flush": true, 00:32:39.034 "reset": true, 00:32:39.034 "compare": false, 00:32:39.034 "compare_and_write": false, 00:32:39.034 "abort": true, 00:32:39.034 "nvme_admin": false, 00:32:39.034 "nvme_io": false 00:32:39.034 }, 00:32:39.034 "memory_domains": [ 00:32:39.034 { 00:32:39.034 "dma_device_id": "system", 00:32:39.034 "dma_device_type": 1 00:32:39.034 }, 00:32:39.034 { 00:32:39.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:39.034 "dma_device_type": 2 00:32:39.034 } 00:32:39.034 ], 00:32:39.034 "driver_specific": { 00:32:39.034 "passthru": { 00:32:39.034 "name": "pt1", 00:32:39.034 "base_bdev_name": "malloc1" 00:32:39.034 } 00:32:39.034 } 00:32:39.034 }' 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:39.034 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:39.292 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:39.292 11:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:39.292 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:39.292 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:32:39.292 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:39.292 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:32:39.293 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:39.551 "name": "pt2", 00:32:39.551 "aliases": [ 00:32:39.551 "00000000-0000-0000-0000-000000000002" 00:32:39.551 ], 00:32:39.551 "product_name": "passthru", 00:32:39.551 "block_size": 512, 00:32:39.551 "num_blocks": 65536, 00:32:39.551 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:39.551 "assigned_rate_limits": { 00:32:39.551 "rw_ios_per_sec": 0, 00:32:39.551 "rw_mbytes_per_sec": 0, 00:32:39.551 "r_mbytes_per_sec": 0, 00:32:39.551 "w_mbytes_per_sec": 0 00:32:39.551 }, 00:32:39.551 "claimed": true, 00:32:39.551 "claim_type": "exclusive_write", 00:32:39.551 "zoned": false, 00:32:39.551 "supported_io_types": { 00:32:39.551 "read": true, 00:32:39.551 "write": true, 00:32:39.551 "unmap": true, 00:32:39.551 "write_zeroes": true, 00:32:39.551 "flush": true, 00:32:39.551 "reset": true, 00:32:39.551 "compare": false, 00:32:39.551 "compare_and_write": false, 00:32:39.551 "abort": true, 00:32:39.551 "nvme_admin": false, 00:32:39.551 "nvme_io": false 00:32:39.551 }, 00:32:39.551 "memory_domains": [ 00:32:39.551 { 00:32:39.551 "dma_device_id": "system", 00:32:39.551 "dma_device_type": 1 00:32:39.551 }, 00:32:39.551 { 00:32:39.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:39.551 "dma_device_type": 2 00:32:39.551 } 00:32:39.551 ], 00:32:39.551 "driver_specific": { 00:32:39.551 "passthru": { 00:32:39.551 "name": "pt2", 00:32:39.551 "base_bdev_name": "malloc2" 00:32:39.551 } 00:32:39.551 } 00:32:39.551 }' 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:39.551 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:39.809 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:39.809 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:39.809 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:39.809 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:32:39.809 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:39.809 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:39.810 "name": "pt3", 00:32:39.810 "aliases": [ 00:32:39.810 "00000000-0000-0000-0000-000000000003" 00:32:39.810 ], 00:32:39.810 "product_name": "passthru", 00:32:39.810 "block_size": 512, 00:32:39.810 "num_blocks": 65536, 00:32:39.810 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:39.810 "assigned_rate_limits": { 00:32:39.810 "rw_ios_per_sec": 0, 
00:32:39.810 "rw_mbytes_per_sec": 0, 00:32:39.810 "r_mbytes_per_sec": 0, 00:32:39.810 "w_mbytes_per_sec": 0 00:32:39.810 }, 00:32:39.810 "claimed": true, 00:32:39.810 "claim_type": "exclusive_write", 00:32:39.810 "zoned": false, 00:32:39.810 "supported_io_types": { 00:32:39.810 "read": true, 00:32:39.810 "write": true, 00:32:39.810 "unmap": true, 00:32:39.810 "write_zeroes": true, 00:32:39.810 "flush": true, 00:32:39.810 "reset": true, 00:32:39.810 "compare": false, 00:32:39.810 "compare_and_write": false, 00:32:39.810 "abort": true, 00:32:39.810 "nvme_admin": false, 00:32:39.810 "nvme_io": false 00:32:39.810 }, 00:32:39.810 "memory_domains": [ 00:32:39.810 { 00:32:39.810 "dma_device_id": "system", 00:32:39.810 "dma_device_type": 1 00:32:39.810 }, 00:32:39.810 { 00:32:39.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:39.810 "dma_device_type": 2 00:32:39.810 } 00:32:39.810 ], 00:32:39.810 "driver_specific": { 00:32:39.810 "passthru": { 00:32:39.810 "name": "pt3", 00:32:39.810 "base_bdev_name": "malloc3" 00:32:39.810 } 00:32:39.810 } 00:32:39.810 }' 00:32:39.810 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:40.067 11:43:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:40.067 11:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:32:40.325 [2024-06-10 11:43:24.177252] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0f0ba238-31cf-4f98-ab85-315eb39c5e13 '!=' 0f0ba238-31cf-4f98-ab85-315eb39c5e13 ']' 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:32:40.325 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:32:40.584 [2024-06-10 11:43:24.349552] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:40.584 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:40.842 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:40.842 "name": "raid_bdev1", 00:32:40.842 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:40.842 "strip_size_kb": 0, 00:32:40.842 "state": "online", 00:32:40.842 "raid_level": "raid1", 00:32:40.842 "superblock": true, 00:32:40.842 "num_base_bdevs": 3, 00:32:40.842 "num_base_bdevs_discovered": 2, 00:32:40.842 "num_base_bdevs_operational": 2, 00:32:40.842 "base_bdevs_list": [ 00:32:40.842 { 00:32:40.842 "name": null, 00:32:40.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:40.842 "is_configured": false, 00:32:40.842 "data_offset": 2048, 00:32:40.842 "data_size": 63488 00:32:40.842 }, 00:32:40.842 { 00:32:40.842 "name": "pt2", 00:32:40.842 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:40.842 "is_configured": true, 00:32:40.842 "data_offset": 2048, 00:32:40.842 "data_size": 63488 00:32:40.842 }, 00:32:40.842 { 00:32:40.842 "name": "pt3", 00:32:40.842 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:40.842 "is_configured": true, 00:32:40.842 
"data_offset": 2048, 00:32:40.842 "data_size": 63488 00:32:40.842 } 00:32:40.842 ] 00:32:40.842 }' 00:32:40.842 11:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:40.842 11:43:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:41.101 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:41.359 [2024-06-10 11:43:25.167632] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:41.359 [2024-06-10 11:43:25.167656] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:41.359 [2024-06-10 11:43:25.167699] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:41.359 [2024-06-10 11:43:25.167737] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:41.359 [2024-06-10 11:43:25.167745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce4a60 name raid_bdev1, state offline 00:32:41.359 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:41.359 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:32:41.619 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:32:41.877 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:32:41.877 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:32:41.877 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:32:41.877 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:32:41.877 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:42.137 [2024-06-10 11:43:25.877569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:42.137 [2024-06-10 11:43:25.877608] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:42.137 [2024-06-10 11:43:25.877620] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce9140 00:32:42.137 [2024-06-10 11:43:25.877627] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:42.137 [2024-06-10 11:43:25.878845] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:42.137 [2024-06-10 11:43:25.878878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:42.137 [2024-06-10 11:43:25.878930] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:32:42.137 [2024-06-10 11:43:25.878952] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:42.137 pt2 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:42.137 11:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:42.137 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:42.137 "name": "raid_bdev1", 00:32:42.137 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:42.137 "strip_size_kb": 0, 00:32:42.137 "state": "configuring", 00:32:42.137 "raid_level": "raid1", 00:32:42.137 "superblock": true, 00:32:42.137 "num_base_bdevs": 3, 00:32:42.137 "num_base_bdevs_discovered": 1, 00:32:42.137 "num_base_bdevs_operational": 2, 
00:32:42.137 "base_bdevs_list": [ 00:32:42.137 { 00:32:42.137 "name": null, 00:32:42.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:42.137 "is_configured": false, 00:32:42.137 "data_offset": 2048, 00:32:42.137 "data_size": 63488 00:32:42.137 }, 00:32:42.137 { 00:32:42.137 "name": "pt2", 00:32:42.137 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:42.137 "is_configured": true, 00:32:42.137 "data_offset": 2048, 00:32:42.137 "data_size": 63488 00:32:42.137 }, 00:32:42.137 { 00:32:42.137 "name": null, 00:32:42.137 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:42.137 "is_configured": false, 00:32:42.137 "data_offset": 2048, 00:32:42.137 "data_size": 63488 00:32:42.137 } 00:32:42.137 ] 00:32:42.137 }' 00:32:42.137 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:42.137 11:43:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:42.706 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:32:42.706 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:32:42.706 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:32:42.706 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:32:42.964 [2024-06-10 11:43:26.679675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:32:42.965 [2024-06-10 11:43:26.679720] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:42.965 [2024-06-10 11:43:26.679735] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce9620 00:32:42.965 [2024-06-10 11:43:26.679744] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:42.965 [2024-06-10 11:43:26.680018] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:42.965 [2024-06-10 11:43:26.680031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:32:42.965 [2024-06-10 11:43:26.680080] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:32:42.965 [2024-06-10 11:43:26.680105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:32:42.965 [2024-06-10 11:43:26.680180] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce7f40 00:32:42.965 [2024-06-10 11:43:26.680187] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:42.965 [2024-06-10 11:43:26.680305] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cec790 00:32:42.965 [2024-06-10 11:43:26.680393] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce7f40 00:32:42.965 [2024-06-10 11:43:26.680399] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ce7f40 00:32:42.965 [2024-06-10 11:43:26.680465] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:42.965 pt3 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:42.965 "name": "raid_bdev1", 00:32:42.965 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:42.965 "strip_size_kb": 0, 00:32:42.965 "state": "online", 00:32:42.965 "raid_level": "raid1", 00:32:42.965 "superblock": true, 00:32:42.965 "num_base_bdevs": 3, 00:32:42.965 "num_base_bdevs_discovered": 2, 00:32:42.965 "num_base_bdevs_operational": 2, 00:32:42.965 "base_bdevs_list": [ 00:32:42.965 { 00:32:42.965 "name": null, 00:32:42.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:42.965 "is_configured": false, 00:32:42.965 "data_offset": 2048, 00:32:42.965 "data_size": 63488 00:32:42.965 }, 00:32:42.965 { 00:32:42.965 "name": "pt2", 00:32:42.965 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:42.965 "is_configured": true, 00:32:42.965 "data_offset": 2048, 00:32:42.965 "data_size": 63488 00:32:42.965 }, 00:32:42.965 { 00:32:42.965 "name": "pt3", 00:32:42.965 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:42.965 "is_configured": true, 00:32:42.965 "data_offset": 2048, 00:32:42.965 "data_size": 63488 00:32:42.965 } 00:32:42.965 ] 00:32:42.965 }' 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:42.965 11:43:26 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:32:43.531 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:43.789 [2024-06-10 11:43:27.497789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:43.789 [2024-06-10 11:43:27.497814] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:43.789 [2024-06-10 11:43:27.497857] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:43.789 [2024-06-10 11:43:27.497902] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:43.789 [2024-06-10 11:43:27.497911] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce7f40 name raid_bdev1, state offline 00:32:43.789 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:43.789 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:32:43.789 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:32:43.789 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:32:43.790 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:32:43.790 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:32:43.790 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:32:44.047 11:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:32:44.304 [2024-06-10 11:43:28.023132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:32:44.304 [2024-06-10 11:43:28.023169] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:44.304 [2024-06-10 11:43:28.023198] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce5f50 00:32:44.304 [2024-06-10 11:43:28.023206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:44.305 [2024-06-10 11:43:28.024415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:44.305 [2024-06-10 11:43:28.024439] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:32:44.305 [2024-06-10 11:43:28.024487] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:32:44.305 [2024-06-10 11:43:28.024508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:32:44.305 [2024-06-10 11:43:28.024579] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:32:44.305 [2024-06-10 11:43:28.024588] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:44.305 [2024-06-10 11:43:28.024597] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cea060 name raid_bdev1, state configuring 00:32:44.305 [2024-06-10 11:43:28.024613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:44.305 pt1 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:44.305 "name": "raid_bdev1", 00:32:44.305 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:44.305 "strip_size_kb": 0, 00:32:44.305 "state": "configuring", 00:32:44.305 "raid_level": "raid1", 00:32:44.305 "superblock": true, 00:32:44.305 "num_base_bdevs": 3, 00:32:44.305 "num_base_bdevs_discovered": 1, 00:32:44.305 "num_base_bdevs_operational": 2, 00:32:44.305 "base_bdevs_list": [ 00:32:44.305 { 00:32:44.305 "name": null, 00:32:44.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:44.305 "is_configured": false, 00:32:44.305 "data_offset": 2048, 00:32:44.305 "data_size": 63488 00:32:44.305 }, 00:32:44.305 { 00:32:44.305 "name": "pt2", 00:32:44.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:44.305 "is_configured": true, 00:32:44.305 
"data_offset": 2048, 00:32:44.305 "data_size": 63488 00:32:44.305 }, 00:32:44.305 { 00:32:44.305 "name": null, 00:32:44.305 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:44.305 "is_configured": false, 00:32:44.305 "data_offset": 2048, 00:32:44.305 "data_size": 63488 00:32:44.305 } 00:32:44.305 ] 00:32:44.305 }' 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:44.305 11:43:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:44.870 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:32:44.870 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:32:45.127 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:32:45.127 11:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:32:45.127 [2024-06-10 11:43:29.061832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:32:45.127 [2024-06-10 11:43:29.061881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:45.127 [2024-06-10 11:43:29.061895] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b38a00 00:32:45.127 [2024-06-10 11:43:29.061904] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:45.127 [2024-06-10 11:43:29.062164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:45.127 [2024-06-10 11:43:29.062176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:32:45.127 [2024-06-10 11:43:29.062222] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:32:45.127 [2024-06-10 11:43:29.062235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:32:45.127 [2024-06-10 11:43:29.062307] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce81c0 00:32:45.127 [2024-06-10 11:43:29.062314] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:45.127 [2024-06-10 11:43:29.062433] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cec790 00:32:45.127 [2024-06-10 11:43:29.062523] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce81c0 00:32:45.127 [2024-06-10 11:43:29.062530] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ce81c0 00:32:45.127 [2024-06-10 11:43:29.062598] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:45.127 pt3 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:45.384 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:45.385 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:45.385 
11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:45.385 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:45.385 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:45.385 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:45.385 "name": "raid_bdev1", 00:32:45.385 "uuid": "0f0ba238-31cf-4f98-ab85-315eb39c5e13", 00:32:45.385 "strip_size_kb": 0, 00:32:45.385 "state": "online", 00:32:45.385 "raid_level": "raid1", 00:32:45.385 "superblock": true, 00:32:45.385 "num_base_bdevs": 3, 00:32:45.385 "num_base_bdevs_discovered": 2, 00:32:45.385 "num_base_bdevs_operational": 2, 00:32:45.385 "base_bdevs_list": [ 00:32:45.385 { 00:32:45.385 "name": null, 00:32:45.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:45.385 "is_configured": false, 00:32:45.385 "data_offset": 2048, 00:32:45.385 "data_size": 63488 00:32:45.385 }, 00:32:45.385 { 00:32:45.385 "name": "pt2", 00:32:45.385 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:45.385 "is_configured": true, 00:32:45.385 "data_offset": 2048, 00:32:45.385 "data_size": 63488 00:32:45.385 }, 00:32:45.385 { 00:32:45.385 "name": "pt3", 00:32:45.385 "uuid": "00000000-0000-0000-0000-000000000003", 00:32:45.385 "is_configured": true, 00:32:45.385 "data_offset": 2048, 00:32:45.385 "data_size": 63488 00:32:45.385 } 00:32:45.385 ] 00:32:45.385 }' 00:32:45.385 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:45.385 11:43:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:45.950 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:32:45.950 
11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:32:46.208 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:32:46.208 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:46.208 11:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:32:46.208 [2024-06-10 11:43:30.084649] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 0f0ba238-31cf-4f98-ab85-315eb39c5e13 '!=' 0f0ba238-31cf-4f98-ab85-315eb39c5e13 ']' 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 175461 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 175461 ']' 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 175461 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 175461 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 175461' 00:32:46.208 killing process with pid 175461 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 175461 00:32:46.208 
[2024-06-10 11:43:30.145786] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:46.208 [2024-06-10 11:43:30.145829] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:46.208 [2024-06-10 11:43:30.145873] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:46.208 [2024-06-10 11:43:30.145882] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce81c0 name raid_bdev1, state offline 00:32:46.208 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 175461 00:32:46.467 [2024-06-10 11:43:30.175849] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:46.467 11:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:32:46.467 00:32:46.467 real 0m16.869s 00:32:46.467 user 0m30.623s 00:32:46.467 sys 0m3.246s 00:32:46.467 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:46.467 11:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:46.467 ************************************ 00:32:46.467 END TEST raid_superblock_test 00:32:46.467 ************************************ 00:32:46.725 11:43:30 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:32:46.725 11:43:30 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:46.725 11:43:30 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:46.725 11:43:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:46.725 ************************************ 00:32:46.725 START TEST raid_read_error_test 00:32:46.725 ************************************ 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 read 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 
00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:32:46.725 11:43:30 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.hQj8Jt9HZy 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=178074 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 178074 /var/tmp/spdk-raid.sock 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 178074 ']' 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:46.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:46.725 11:43:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:46.725 [2024-06-10 11:43:30.529901] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:32:46.725 [2024-06-10 11:43:30.529953] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid178074 ] 00:32:46.725 [2024-06-10 11:43:30.616273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:46.983 [2024-06-10 11:43:30.701266] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:46.983 [2024-06-10 11:43:30.758396] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:46.983 [2024-06-10 11:43:30.758422] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:47.548 11:43:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:47.548 11:43:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:32:47.548 11:43:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:47.548 11:43:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:32:47.806 BaseBdev1_malloc 00:32:47.806 11:43:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:32:47.806 true 00:32:47.806 11:43:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:32:48.063 [2024-06-10 11:43:31.859897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:32:48.063 [2024-06-10 11:43:31.859934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:48.063 
[2024-06-10 11:43:31.859947] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc8b10 00:32:48.063 [2024-06-10 11:43:31.859971] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:48.063 [2024-06-10 11:43:31.861335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:48.063 [2024-06-10 11:43:31.861357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:32:48.063 BaseBdev1 00:32:48.063 11:43:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:48.064 11:43:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:32:48.320 BaseBdev2_malloc 00:32:48.320 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:32:48.320 true 00:32:48.320 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:32:48.577 [2024-06-10 11:43:32.381265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:32:48.577 [2024-06-10 11:43:32.381301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:48.577 [2024-06-10 11:43:32.381315] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdcd280 00:32:48.577 [2024-06-10 11:43:32.381324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:48.577 [2024-06-10 11:43:32.382468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:48.577 [2024-06-10 11:43:32.382491] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:32:48.577 BaseBdev2 00:32:48.577 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:48.577 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:32:48.835 BaseBdev3_malloc 00:32:48.835 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:32:48.835 true 00:32:48.835 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:32:49.094 [2024-06-10 11:43:32.891238] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:32:49.094 [2024-06-10 11:43:32.891273] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:49.094 [2024-06-10 11:43:32.891305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdcfab0 00:32:49.094 [2024-06-10 11:43:32.891313] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:49.094 [2024-06-10 11:43:32.892491] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:49.094 [2024-06-10 11:43:32.892513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:32:49.094 BaseBdev3 00:32:49.094 11:43:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:32:49.352 [2024-06-10 11:43:33.063716] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:49.352 [2024-06-10 11:43:33.064720] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:49.352 [2024-06-10 11:43:33.064767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:49.352 [2024-06-10 11:43:33.064929] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdce0b0 00:32:49.352 [2024-06-10 11:43:33.064938] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:49.352 [2024-06-10 11:43:33.065080] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcd740 00:32:49.352 [2024-06-10 11:43:33.065200] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdce0b0 00:32:49.352 [2024-06-10 11:43:33.065207] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdce0b0 00:32:49.352 [2024-06-10 11:43:33.065279] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:49.352 
11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:49.352 "name": "raid_bdev1", 00:32:49.352 "uuid": "22f3300e-9833-47b1-a2d8-68c8642cc44b", 00:32:49.352 "strip_size_kb": 0, 00:32:49.352 "state": "online", 00:32:49.352 "raid_level": "raid1", 00:32:49.352 "superblock": true, 00:32:49.352 "num_base_bdevs": 3, 00:32:49.352 "num_base_bdevs_discovered": 3, 00:32:49.352 "num_base_bdevs_operational": 3, 00:32:49.352 "base_bdevs_list": [ 00:32:49.352 { 00:32:49.352 "name": "BaseBdev1", 00:32:49.352 "uuid": "27147e94-7909-5f04-9425-35e266e0682f", 00:32:49.352 "is_configured": true, 00:32:49.352 "data_offset": 2048, 00:32:49.352 "data_size": 63488 00:32:49.352 }, 00:32:49.352 { 00:32:49.352 "name": "BaseBdev2", 00:32:49.352 "uuid": "7c22e3c1-ddd7-5e4c-99aa-ab5f98c0e8ce", 00:32:49.352 "is_configured": true, 00:32:49.352 "data_offset": 2048, 00:32:49.352 "data_size": 63488 00:32:49.352 }, 00:32:49.352 { 00:32:49.352 "name": "BaseBdev3", 00:32:49.352 "uuid": "809c297b-c6b7-54d3-bd83-9420a951ffc2", 00:32:49.352 "is_configured": true, 00:32:49.352 "data_offset": 2048, 00:32:49.352 "data_size": 63488 00:32:49.352 } 00:32:49.352 ] 00:32:49.352 }' 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:49.352 11:43:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:49.917 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:32:49.917 11:43:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:32:49.917 [2024-06-10 11:43:33.837918] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc19400 00:32:50.850 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:51.108 11:43:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:51.366 11:43:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:51.366 "name": "raid_bdev1", 00:32:51.366 "uuid": "22f3300e-9833-47b1-a2d8-68c8642cc44b", 00:32:51.366 "strip_size_kb": 0, 00:32:51.366 "state": "online", 00:32:51.366 "raid_level": "raid1", 00:32:51.366 "superblock": true, 00:32:51.366 "num_base_bdevs": 3, 00:32:51.366 "num_base_bdevs_discovered": 3, 00:32:51.366 "num_base_bdevs_operational": 3, 00:32:51.366 "base_bdevs_list": [ 00:32:51.366 { 00:32:51.366 "name": "BaseBdev1", 00:32:51.366 "uuid": "27147e94-7909-5f04-9425-35e266e0682f", 00:32:51.366 "is_configured": true, 00:32:51.366 "data_offset": 2048, 00:32:51.366 "data_size": 63488 00:32:51.366 }, 00:32:51.366 { 00:32:51.366 "name": "BaseBdev2", 00:32:51.366 "uuid": "7c22e3c1-ddd7-5e4c-99aa-ab5f98c0e8ce", 00:32:51.366 "is_configured": true, 00:32:51.366 "data_offset": 2048, 00:32:51.366 "data_size": 63488 00:32:51.366 }, 00:32:51.366 { 00:32:51.366 "name": "BaseBdev3", 00:32:51.366 "uuid": "809c297b-c6b7-54d3-bd83-9420a951ffc2", 00:32:51.366 "is_configured": true, 00:32:51.366 "data_offset": 2048, 00:32:51.366 "data_size": 63488 00:32:51.366 } 00:32:51.366 ] 00:32:51.366 }' 00:32:51.366 11:43:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:51.366 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:51.930 [2024-06-10 11:43:35.776791] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:51.930 [2024-06-10 11:43:35.776830] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:51.930 [2024-06-10 11:43:35.778906] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:51.930 [2024-06-10 11:43:35.778929] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:51.930 [2024-06-10 11:43:35.778992] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:51.930 [2024-06-10 11:43:35.778999] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdce0b0 name raid_bdev1, state offline 00:32:51.930 0 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 178074 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 178074 ']' 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 178074 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 178074 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 178074' 00:32:51.930 killing process with pid 178074 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 
178074 00:32:51.930 [2024-06-10 11:43:35.835150] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:51.930 11:43:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 178074 00:32:51.931 [2024-06-10 11:43:35.855170] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.hQj8Jt9HZy 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:32:52.188 11:43:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:32:52.188 00:32:52.189 real 0m5.605s 00:32:52.189 user 0m8.560s 00:32:52.189 sys 0m0.976s 00:32:52.189 11:43:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:52.189 11:43:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:52.189 ************************************ 00:32:52.189 END TEST raid_read_error_test 00:32:52.189 ************************************ 00:32:52.189 11:43:36 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:32:52.189 11:43:36 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:52.189 11:43:36 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:52.189 11:43:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:52.447 ************************************ 
00:32:52.447 START TEST raid_write_error_test 00:32:52.447 ************************************ 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 write 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.4QgVtStPlI 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=178921 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 178921 /var/tmp/spdk-raid.sock 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 178921 ']' 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:52.447 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:52.447 11:43:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:52.447 [2024-06-10 11:43:36.221664] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:32:52.447 [2024-06-10 11:43:36.221718] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid178921 ] 00:32:52.447 [2024-06-10 11:43:36.309786] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:52.705 [2024-06-10 11:43:36.395646] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:52.705 [2024-06-10 11:43:36.452209] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:52.705 [2024-06-10 11:43:36.452242] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:53.270 11:43:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:53.270 11:43:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:32:53.270 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:53.270 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:32:53.270 BaseBdev1_malloc 00:32:53.270 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:32:53.528 true 00:32:53.528 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:32:53.786 [2024-06-10 11:43:37.511389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:32:53.786 [2024-06-10 11:43:37.511427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:53.786 [2024-06-10 11:43:37.511439] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2462b10 00:32:53.786 [2024-06-10 11:43:37.511464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:53.786 [2024-06-10 11:43:37.512646] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:53.786 [2024-06-10 11:43:37.512667] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:32:53.786 BaseBdev1 00:32:53.786 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:53.786 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:32:53.786 BaseBdev2_malloc 00:32:53.786 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:32:54.043 true 00:32:54.043 11:43:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:32:54.301 [2024-06-10 11:43:38.056496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:32:54.301 [2024-06-10 11:43:38.056528] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:54.301 [2024-06-10 11:43:38.056541] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x2467280 00:32:54.301 [2024-06-10 11:43:38.056549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:54.301 [2024-06-10 11:43:38.057560] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:54.301 [2024-06-10 11:43:38.057581] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:32:54.301 BaseBdev2 00:32:54.301 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:54.301 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:32:54.301 BaseBdev3_malloc 00:32:54.559 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:32:54.559 true 00:32:54.559 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:32:54.817 [2024-06-10 11:43:38.557489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:32:54.817 [2024-06-10 11:43:38.557526] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:54.817 [2024-06-10 11:43:38.557540] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2469ab0 00:32:54.817 [2024-06-10 11:43:38.557549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:54.817 [2024-06-10 11:43:38.558572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:54.817 [2024-06-10 11:43:38.558594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:32:54.817 
BaseBdev3 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:32:54.817 [2024-06-10 11:43:38.729964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:54.817 [2024-06-10 11:43:38.730744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:54.817 [2024-06-10 11:43:38.730787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:54.817 [2024-06-10 11:43:38.730934] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24680b0 00:32:54.817 [2024-06-10 11:43:38.730941] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:54.817 [2024-06-10 11:43:38.731060] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2467740 00:32:54.817 [2024-06-10 11:43:38.731163] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24680b0 00:32:54.817 [2024-06-10 11:43:38.731170] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24680b0 00:32:54.817 [2024-06-10 11:43:38.731232] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:54.817 11:43:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:54.817 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:55.076 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:55.076 "name": "raid_bdev1", 00:32:55.076 "uuid": "fa7195da-4482-4886-83fa-c9bcb722e7c2", 00:32:55.076 "strip_size_kb": 0, 00:32:55.076 "state": "online", 00:32:55.076 "raid_level": "raid1", 00:32:55.076 "superblock": true, 00:32:55.076 "num_base_bdevs": 3, 00:32:55.076 "num_base_bdevs_discovered": 3, 00:32:55.076 "num_base_bdevs_operational": 3, 00:32:55.076 "base_bdevs_list": [ 00:32:55.076 { 00:32:55.076 "name": "BaseBdev1", 00:32:55.076 "uuid": "3284fbdc-e9d3-565c-86cf-f6d06e018f68", 00:32:55.076 "is_configured": true, 00:32:55.076 "data_offset": 2048, 00:32:55.076 "data_size": 63488 00:32:55.076 }, 00:32:55.076 { 00:32:55.076 "name": "BaseBdev2", 00:32:55.076 "uuid": "be6a9765-c600-5653-a85f-f5e4ee1df45b", 00:32:55.076 "is_configured": true, 00:32:55.076 "data_offset": 2048, 00:32:55.076 "data_size": 63488 00:32:55.076 }, 00:32:55.076 { 00:32:55.076 "name": "BaseBdev3", 00:32:55.076 "uuid": "588e19db-039b-5751-8990-86f8f8838571", 00:32:55.076 "is_configured": true, 00:32:55.076 "data_offset": 2048, 00:32:55.076 "data_size": 
63488 00:32:55.076 } 00:32:55.076 ] 00:32:55.076 }' 00:32:55.076 11:43:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:55.076 11:43:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:55.642 11:43:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:32:55.642 11:43:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:32:55.642 [2024-06-10 11:43:39.488116] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b3400 00:32:56.576 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:32:56.835 [2024-06-10 11:43:40.573452] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:32:56.835 [2024-06-10 11:43:40.573506] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:56.835 [2024-06-10 11:43:40.573675] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22b3400 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:56.835 "name": "raid_bdev1", 00:32:56.835 "uuid": "fa7195da-4482-4886-83fa-c9bcb722e7c2", 00:32:56.835 "strip_size_kb": 0, 00:32:56.835 "state": "online", 00:32:56.835 "raid_level": "raid1", 00:32:56.835 "superblock": true, 00:32:56.835 "num_base_bdevs": 3, 00:32:56.835 "num_base_bdevs_discovered": 2, 00:32:56.835 "num_base_bdevs_operational": 2, 00:32:56.835 "base_bdevs_list": [ 00:32:56.835 { 00:32:56.835 "name": null, 00:32:56.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:56.835 "is_configured": false, 00:32:56.835 "data_offset": 2048, 00:32:56.835 "data_size": 63488 00:32:56.835 }, 00:32:56.835 { 00:32:56.835 "name": "BaseBdev2", 00:32:56.835 "uuid": 
"be6a9765-c600-5653-a85f-f5e4ee1df45b", 00:32:56.835 "is_configured": true, 00:32:56.835 "data_offset": 2048, 00:32:56.835 "data_size": 63488 00:32:56.835 }, 00:32:56.835 { 00:32:56.835 "name": "BaseBdev3", 00:32:56.835 "uuid": "588e19db-039b-5751-8990-86f8f8838571", 00:32:56.835 "is_configured": true, 00:32:56.835 "data_offset": 2048, 00:32:56.835 "data_size": 63488 00:32:56.835 } 00:32:56.835 ] 00:32:56.835 }' 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:56.835 11:43:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:57.403 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:57.661 [2024-06-10 11:43:41.436886] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:57.661 [2024-06-10 11:43:41.436919] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:57.661 [2024-06-10 11:43:41.439025] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:57.661 [2024-06-10 11:43:41.439048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:57.661 [2024-06-10 11:43:41.439099] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:57.661 [2024-06-10 11:43:41.439107] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24680b0 name raid_bdev1, state offline 00:32:57.661 0 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 178921 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 178921 ']' 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 178921 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # uname 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 178921 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 178921' 00:32:57.661 killing process with pid 178921 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 178921 00:32:57.661 [2024-06-10 11:43:41.505384] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:57.661 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 178921 00:32:57.661 [2024-06-10 11:43:41.525789] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.4QgVtStPlI 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:32:57.920 00:32:57.920 real 0m5.585s 00:32:57.920 user 0m8.463s 
00:32:57.920 sys 0m1.037s 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:57.920 11:43:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:57.920 ************************************ 00:32:57.920 END TEST raid_write_error_test 00:32:57.920 ************************************ 00:32:57.920 11:43:41 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:32:57.920 11:43:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:32:57.920 11:43:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:32:57.920 11:43:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:57.920 11:43:41 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:57.920 11:43:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:57.920 ************************************ 00:32:57.920 START TEST raid_state_function_test 00:32:57.920 ************************************ 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 false 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:32:57.920 11:43:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@231 -- # strip_size=64 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=179843 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 179843' 00:32:57.920 Process raid pid: 179843 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 179843 /var/tmp/spdk-raid.sock 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 179843 ']' 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:57.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:32:57.920 11:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:58.179 [2024-06-10 11:43:41.883302] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:32:58.179 [2024-06-10 11:43:41.883352] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:58.179 [2024-06-10 11:43:41.971722] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:58.179 [2024-06-10 11:43:42.059630] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:58.179 [2024-06-10 11:43:42.119463] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:58.179 [2024-06-10 11:43:42.119484] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:58.745 11:43:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:58.745 11:43:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:32:58.745 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:59.005 [2024-06-10 11:43:42.822683] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:59.005 [2024-06-10 11:43:42.822718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:59.005 [2024-06-10 11:43:42.822725] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:59.005 [2024-06-10 11:43:42.822734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:59.005 [2024-06-10 11:43:42.822739] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:59.005 [2024-06-10 11:43:42.822747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:59.005 [2024-06-10 
11:43:42.822752] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:59.005 [2024-06-10 11:43:42.822760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:59.005 11:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:59.264 11:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:59.264 "name": "Existed_Raid", 00:32:59.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:59.264 "strip_size_kb": 64, 00:32:59.264 "state": 
"configuring", 00:32:59.264 "raid_level": "raid0", 00:32:59.264 "superblock": false, 00:32:59.264 "num_base_bdevs": 4, 00:32:59.264 "num_base_bdevs_discovered": 0, 00:32:59.264 "num_base_bdevs_operational": 4, 00:32:59.264 "base_bdevs_list": [ 00:32:59.264 { 00:32:59.264 "name": "BaseBdev1", 00:32:59.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:59.264 "is_configured": false, 00:32:59.264 "data_offset": 0, 00:32:59.264 "data_size": 0 00:32:59.264 }, 00:32:59.264 { 00:32:59.264 "name": "BaseBdev2", 00:32:59.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:59.264 "is_configured": false, 00:32:59.264 "data_offset": 0, 00:32:59.264 "data_size": 0 00:32:59.264 }, 00:32:59.264 { 00:32:59.264 "name": "BaseBdev3", 00:32:59.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:59.264 "is_configured": false, 00:32:59.264 "data_offset": 0, 00:32:59.264 "data_size": 0 00:32:59.264 }, 00:32:59.264 { 00:32:59.264 "name": "BaseBdev4", 00:32:59.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:59.264 "is_configured": false, 00:32:59.264 "data_offset": 0, 00:32:59.264 "data_size": 0 00:32:59.264 } 00:32:59.264 ] 00:32:59.264 }' 00:32:59.264 11:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:59.264 11:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:59.830 11:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:59.830 [2024-06-10 11:43:43.684830] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:59.830 [2024-06-10 11:43:43.684861] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13be550 name Existed_Raid, state configuring 00:32:59.830 11:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:00.089 [2024-06-10 11:43:43.857294] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:00.089 [2024-06-10 11:43:43.857317] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:00.089 [2024-06-10 11:43:43.857324] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:00.089 [2024-06-10 11:43:43.857331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:00.089 [2024-06-10 11:43:43.857336] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:33:00.089 [2024-06-10 11:43:43.857343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:33:00.089 [2024-06-10 11:43:43.857349] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:33:00.089 [2024-06-10 11:43:43.857356] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:33:00.089 11:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:33:00.347 [2024-06-10 11:43:44.038522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:00.347 BaseBdev1 00:33:00.347 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:33:00.347 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:33:00.347 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:00.347 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:00.347 11:43:44 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:00.347 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:00.348 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:00.348 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:33:00.606 [ 00:33:00.606 { 00:33:00.606 "name": "BaseBdev1", 00:33:00.606 "aliases": [ 00:33:00.606 "faa32ebf-5b2c-4091-911f-c08f164d77cf" 00:33:00.606 ], 00:33:00.606 "product_name": "Malloc disk", 00:33:00.606 "block_size": 512, 00:33:00.606 "num_blocks": 65536, 00:33:00.606 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:00.606 "assigned_rate_limits": { 00:33:00.606 "rw_ios_per_sec": 0, 00:33:00.606 "rw_mbytes_per_sec": 0, 00:33:00.606 "r_mbytes_per_sec": 0, 00:33:00.606 "w_mbytes_per_sec": 0 00:33:00.606 }, 00:33:00.606 "claimed": true, 00:33:00.606 "claim_type": "exclusive_write", 00:33:00.606 "zoned": false, 00:33:00.606 "supported_io_types": { 00:33:00.606 "read": true, 00:33:00.606 "write": true, 00:33:00.606 "unmap": true, 00:33:00.606 "write_zeroes": true, 00:33:00.606 "flush": true, 00:33:00.606 "reset": true, 00:33:00.606 "compare": false, 00:33:00.606 "compare_and_write": false, 00:33:00.606 "abort": true, 00:33:00.606 "nvme_admin": false, 00:33:00.606 "nvme_io": false 00:33:00.606 }, 00:33:00.606 "memory_domains": [ 00:33:00.606 { 00:33:00.606 "dma_device_id": "system", 00:33:00.606 "dma_device_type": 1 00:33:00.606 }, 00:33:00.606 { 00:33:00.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:00.606 "dma_device_type": 2 00:33:00.606 } 00:33:00.606 ], 00:33:00.606 "driver_specific": {} 00:33:00.606 } 00:33:00.606 ] 00:33:00.606 
11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:00.606 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:00.864 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:00.864 "name": "Existed_Raid", 00:33:00.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:00.864 "strip_size_kb": 64, 00:33:00.864 "state": "configuring", 00:33:00.864 "raid_level": "raid0", 00:33:00.864 "superblock": false, 00:33:00.864 "num_base_bdevs": 4, 00:33:00.864 
"num_base_bdevs_discovered": 1, 00:33:00.864 "num_base_bdevs_operational": 4, 00:33:00.864 "base_bdevs_list": [ 00:33:00.864 { 00:33:00.864 "name": "BaseBdev1", 00:33:00.864 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:00.864 "is_configured": true, 00:33:00.864 "data_offset": 0, 00:33:00.864 "data_size": 65536 00:33:00.864 }, 00:33:00.864 { 00:33:00.864 "name": "BaseBdev2", 00:33:00.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:00.864 "is_configured": false, 00:33:00.864 "data_offset": 0, 00:33:00.864 "data_size": 0 00:33:00.864 }, 00:33:00.864 { 00:33:00.864 "name": "BaseBdev3", 00:33:00.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:00.864 "is_configured": false, 00:33:00.864 "data_offset": 0, 00:33:00.864 "data_size": 0 00:33:00.864 }, 00:33:00.864 { 00:33:00.864 "name": "BaseBdev4", 00:33:00.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:00.864 "is_configured": false, 00:33:00.864 "data_offset": 0, 00:33:00.864 "data_size": 0 00:33:00.865 } 00:33:00.865 ] 00:33:00.865 }' 00:33:00.865 11:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:00.865 11:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:01.122 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:01.379 [2024-06-10 11:43:45.161430] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:01.379 [2024-06-10 11:43:45.161462] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bddc0 name Existed_Raid, state configuring 00:33:01.379 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:01.636 
[2024-06-10 11:43:45.337923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:01.636 [2024-06-10 11:43:45.338944] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:01.636 [2024-06-10 11:43:45.338970] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:01.636 [2024-06-10 11:43:45.338977] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:33:01.636 [2024-06-10 11:43:45.338985] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:33:01.636 [2024-06-10 11:43:45.338990] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:33:01.636 [2024-06-10 11:43:45.338998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:01.636 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:01.636 "name": "Existed_Raid", 00:33:01.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:01.636 "strip_size_kb": 64, 00:33:01.636 "state": "configuring", 00:33:01.636 "raid_level": "raid0", 00:33:01.636 "superblock": false, 00:33:01.636 "num_base_bdevs": 4, 00:33:01.636 "num_base_bdevs_discovered": 1, 00:33:01.636 "num_base_bdevs_operational": 4, 00:33:01.636 "base_bdevs_list": [ 00:33:01.636 { 00:33:01.636 "name": "BaseBdev1", 00:33:01.636 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:01.636 "is_configured": true, 00:33:01.636 "data_offset": 0, 00:33:01.636 "data_size": 65536 00:33:01.636 }, 00:33:01.636 { 00:33:01.636 "name": "BaseBdev2", 00:33:01.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:01.636 "is_configured": false, 00:33:01.636 "data_offset": 0, 00:33:01.636 "data_size": 0 00:33:01.636 }, 00:33:01.636 { 00:33:01.636 "name": "BaseBdev3", 00:33:01.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:01.636 "is_configured": false, 00:33:01.636 "data_offset": 0, 00:33:01.636 "data_size": 0 00:33:01.636 }, 00:33:01.636 { 00:33:01.636 "name": "BaseBdev4", 00:33:01.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:01.637 "is_configured": false, 00:33:01.637 "data_offset": 0, 00:33:01.637 "data_size": 0 00:33:01.637 } 
00:33:01.637 ] 00:33:01.637 }' 00:33:01.637 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:01.637 11:43:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:02.202 11:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:33:02.460 [2024-06-10 11:43:46.155008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:02.460 BaseBdev2 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:02.460 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:33:02.719 [ 00:33:02.719 { 00:33:02.719 "name": "BaseBdev2", 00:33:02.719 "aliases": [ 00:33:02.719 "739fd3ab-2ea3-4afd-bf7c-1936a000df17" 00:33:02.719 ], 00:33:02.719 "product_name": "Malloc disk", 00:33:02.719 "block_size": 512, 00:33:02.719 "num_blocks": 65536, 00:33:02.719 "uuid": 
"739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:02.719 "assigned_rate_limits": { 00:33:02.719 "rw_ios_per_sec": 0, 00:33:02.719 "rw_mbytes_per_sec": 0, 00:33:02.719 "r_mbytes_per_sec": 0, 00:33:02.719 "w_mbytes_per_sec": 0 00:33:02.719 }, 00:33:02.719 "claimed": true, 00:33:02.719 "claim_type": "exclusive_write", 00:33:02.719 "zoned": false, 00:33:02.719 "supported_io_types": { 00:33:02.719 "read": true, 00:33:02.719 "write": true, 00:33:02.719 "unmap": true, 00:33:02.719 "write_zeroes": true, 00:33:02.719 "flush": true, 00:33:02.719 "reset": true, 00:33:02.719 "compare": false, 00:33:02.719 "compare_and_write": false, 00:33:02.719 "abort": true, 00:33:02.719 "nvme_admin": false, 00:33:02.719 "nvme_io": false 00:33:02.719 }, 00:33:02.719 "memory_domains": [ 00:33:02.719 { 00:33:02.719 "dma_device_id": "system", 00:33:02.719 "dma_device_type": 1 00:33:02.719 }, 00:33:02.719 { 00:33:02.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:02.719 "dma_device_type": 2 00:33:02.719 } 00:33:02.719 ], 00:33:02.719 "driver_specific": {} 00:33:02.719 } 00:33:02.719 ] 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:02.719 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:02.977 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:02.977 "name": "Existed_Raid", 00:33:02.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:02.977 "strip_size_kb": 64, 00:33:02.977 "state": "configuring", 00:33:02.977 "raid_level": "raid0", 00:33:02.977 "superblock": false, 00:33:02.977 "num_base_bdevs": 4, 00:33:02.977 "num_base_bdevs_discovered": 2, 00:33:02.977 "num_base_bdevs_operational": 4, 00:33:02.977 "base_bdevs_list": [ 00:33:02.977 { 00:33:02.977 "name": "BaseBdev1", 00:33:02.977 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:02.977 "is_configured": true, 00:33:02.977 "data_offset": 0, 00:33:02.977 "data_size": 65536 00:33:02.977 }, 00:33:02.977 { 00:33:02.977 "name": "BaseBdev2", 00:33:02.977 "uuid": "739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:02.977 "is_configured": true, 00:33:02.977 "data_offset": 0, 00:33:02.977 "data_size": 65536 00:33:02.977 }, 00:33:02.977 { 00:33:02.977 "name": "BaseBdev3", 00:33:02.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:02.977 "is_configured": false, 
00:33:02.977 "data_offset": 0, 00:33:02.977 "data_size": 0 00:33:02.977 }, 00:33:02.977 { 00:33:02.977 "name": "BaseBdev4", 00:33:02.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:02.977 "is_configured": false, 00:33:02.977 "data_offset": 0, 00:33:02.977 "data_size": 0 00:33:02.977 } 00:33:02.977 ] 00:33:02.977 }' 00:33:02.977 11:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:02.977 11:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:03.236 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:33:03.519 [2024-06-10 11:43:47.333004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:03.519 BaseBdev3 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:03.519 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 
00:33:03.819 [ 00:33:03.819 { 00:33:03.819 "name": "BaseBdev3", 00:33:03.819 "aliases": [ 00:33:03.819 "64468cba-ca56-405c-933f-3828843893e2" 00:33:03.819 ], 00:33:03.819 "product_name": "Malloc disk", 00:33:03.819 "block_size": 512, 00:33:03.819 "num_blocks": 65536, 00:33:03.819 "uuid": "64468cba-ca56-405c-933f-3828843893e2", 00:33:03.819 "assigned_rate_limits": { 00:33:03.819 "rw_ios_per_sec": 0, 00:33:03.819 "rw_mbytes_per_sec": 0, 00:33:03.819 "r_mbytes_per_sec": 0, 00:33:03.819 "w_mbytes_per_sec": 0 00:33:03.819 }, 00:33:03.819 "claimed": true, 00:33:03.819 "claim_type": "exclusive_write", 00:33:03.819 "zoned": false, 00:33:03.819 "supported_io_types": { 00:33:03.819 "read": true, 00:33:03.819 "write": true, 00:33:03.819 "unmap": true, 00:33:03.819 "write_zeroes": true, 00:33:03.819 "flush": true, 00:33:03.819 "reset": true, 00:33:03.819 "compare": false, 00:33:03.819 "compare_and_write": false, 00:33:03.819 "abort": true, 00:33:03.819 "nvme_admin": false, 00:33:03.819 "nvme_io": false 00:33:03.819 }, 00:33:03.819 "memory_domains": [ 00:33:03.819 { 00:33:03.819 "dma_device_id": "system", 00:33:03.819 "dma_device_type": 1 00:33:03.819 }, 00:33:03.819 { 00:33:03.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:03.819 "dma_device_type": 2 00:33:03.819 } 00:33:03.819 ], 00:33:03.819 "driver_specific": {} 00:33:03.819 } 00:33:03.819 ] 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:03.819 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:04.083 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:04.083 "name": "Existed_Raid", 00:33:04.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:04.083 "strip_size_kb": 64, 00:33:04.083 "state": "configuring", 00:33:04.083 "raid_level": "raid0", 00:33:04.083 "superblock": false, 00:33:04.083 "num_base_bdevs": 4, 00:33:04.083 "num_base_bdevs_discovered": 3, 00:33:04.083 "num_base_bdevs_operational": 4, 00:33:04.083 "base_bdevs_list": [ 00:33:04.083 { 00:33:04.083 "name": "BaseBdev1", 00:33:04.083 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:04.083 "is_configured": true, 00:33:04.083 "data_offset": 0, 00:33:04.083 "data_size": 65536 00:33:04.083 }, 00:33:04.083 { 00:33:04.083 "name": "BaseBdev2", 00:33:04.083 "uuid": 
"739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:04.083 "is_configured": true, 00:33:04.083 "data_offset": 0, 00:33:04.083 "data_size": 65536 00:33:04.083 }, 00:33:04.083 { 00:33:04.083 "name": "BaseBdev3", 00:33:04.083 "uuid": "64468cba-ca56-405c-933f-3828843893e2", 00:33:04.083 "is_configured": true, 00:33:04.083 "data_offset": 0, 00:33:04.083 "data_size": 65536 00:33:04.083 }, 00:33:04.083 { 00:33:04.083 "name": "BaseBdev4", 00:33:04.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:04.083 "is_configured": false, 00:33:04.083 "data_offset": 0, 00:33:04.083 "data_size": 0 00:33:04.083 } 00:33:04.083 ] 00:33:04.083 }' 00:33:04.083 11:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:04.083 11:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:33:04.648 [2024-06-10 11:43:48.516071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:33:04.648 [2024-06-10 11:43:48.516105] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13bee20 00:33:04.648 [2024-06-10 11:43:48.516112] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:33:04.648 [2024-06-10 11:43:48.516284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bfa70 00:33:04.648 [2024-06-10 11:43:48.516368] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13bee20 00:33:04.648 [2024-06-10 11:43:48.516375] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13bee20 00:33:04.648 [2024-06-10 11:43:48.516511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:04.648 BaseBdev4 00:33:04.648 11:43:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:04.648 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:04.906 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:33:05.164 [ 00:33:05.164 { 00:33:05.164 "name": "BaseBdev4", 00:33:05.164 "aliases": [ 00:33:05.164 "6a7e508b-db70-4cf1-b059-753cb54e7e08" 00:33:05.164 ], 00:33:05.164 "product_name": "Malloc disk", 00:33:05.164 "block_size": 512, 00:33:05.164 "num_blocks": 65536, 00:33:05.164 "uuid": "6a7e508b-db70-4cf1-b059-753cb54e7e08", 00:33:05.164 "assigned_rate_limits": { 00:33:05.164 "rw_ios_per_sec": 0, 00:33:05.164 "rw_mbytes_per_sec": 0, 00:33:05.164 "r_mbytes_per_sec": 0, 00:33:05.164 "w_mbytes_per_sec": 0 00:33:05.164 }, 00:33:05.164 "claimed": true, 00:33:05.164 "claim_type": "exclusive_write", 00:33:05.164 "zoned": false, 00:33:05.164 "supported_io_types": { 00:33:05.164 "read": true, 00:33:05.164 "write": true, 00:33:05.164 "unmap": true, 00:33:05.164 "write_zeroes": true, 00:33:05.164 "flush": true, 00:33:05.164 "reset": true, 00:33:05.164 "compare": false, 00:33:05.164 "compare_and_write": false, 
00:33:05.164 "abort": true, 00:33:05.164 "nvme_admin": false, 00:33:05.164 "nvme_io": false 00:33:05.164 }, 00:33:05.164 "memory_domains": [ 00:33:05.164 { 00:33:05.164 "dma_device_id": "system", 00:33:05.164 "dma_device_type": 1 00:33:05.164 }, 00:33:05.164 { 00:33:05.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:05.164 "dma_device_type": 2 00:33:05.164 } 00:33:05.164 ], 00:33:05.164 "driver_specific": {} 00:33:05.164 } 00:33:05.164 ] 00:33:05.164 11:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:05.164 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:05.164 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:05.164 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:33:05.164 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:05.165 11:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:05.165 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:05.165 "name": "Existed_Raid", 00:33:05.165 "uuid": "c4d42a13-7c9c-4c1a-9301-5d17a6e56042", 00:33:05.165 "strip_size_kb": 64, 00:33:05.165 "state": "online", 00:33:05.165 "raid_level": "raid0", 00:33:05.165 "superblock": false, 00:33:05.165 "num_base_bdevs": 4, 00:33:05.165 "num_base_bdevs_discovered": 4, 00:33:05.165 "num_base_bdevs_operational": 4, 00:33:05.165 "base_bdevs_list": [ 00:33:05.165 { 00:33:05.165 "name": "BaseBdev1", 00:33:05.165 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:05.165 "is_configured": true, 00:33:05.165 "data_offset": 0, 00:33:05.165 "data_size": 65536 00:33:05.165 }, 00:33:05.165 { 00:33:05.165 "name": "BaseBdev2", 00:33:05.165 "uuid": "739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:05.165 "is_configured": true, 00:33:05.165 "data_offset": 0, 00:33:05.165 "data_size": 65536 00:33:05.165 }, 00:33:05.165 { 00:33:05.165 "name": "BaseBdev3", 00:33:05.165 "uuid": "64468cba-ca56-405c-933f-3828843893e2", 00:33:05.165 "is_configured": true, 00:33:05.165 "data_offset": 0, 00:33:05.165 "data_size": 65536 00:33:05.165 }, 00:33:05.165 { 00:33:05.165 "name": "BaseBdev4", 00:33:05.165 "uuid": "6a7e508b-db70-4cf1-b059-753cb54e7e08", 00:33:05.165 "is_configured": true, 00:33:05.165 "data_offset": 0, 00:33:05.165 "data_size": 65536 00:33:05.165 } 00:33:05.165 ] 00:33:05.165 }' 00:33:05.165 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:05.165 11:43:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:05.731 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:05.989 [2024-06-10 11:43:49.679377] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:05.989 "name": "Existed_Raid", 00:33:05.989 "aliases": [ 00:33:05.989 "c4d42a13-7c9c-4c1a-9301-5d17a6e56042" 00:33:05.989 ], 00:33:05.989 "product_name": "Raid Volume", 00:33:05.989 "block_size": 512, 00:33:05.989 "num_blocks": 262144, 00:33:05.989 "uuid": "c4d42a13-7c9c-4c1a-9301-5d17a6e56042", 00:33:05.989 "assigned_rate_limits": { 00:33:05.989 "rw_ios_per_sec": 0, 00:33:05.989 "rw_mbytes_per_sec": 0, 00:33:05.989 "r_mbytes_per_sec": 0, 00:33:05.989 "w_mbytes_per_sec": 0 00:33:05.989 }, 00:33:05.989 "claimed": false, 00:33:05.989 "zoned": false, 00:33:05.989 "supported_io_types": { 00:33:05.989 "read": true, 00:33:05.989 "write": true, 00:33:05.989 "unmap": true, 00:33:05.989 "write_zeroes": true, 00:33:05.989 "flush": true, 00:33:05.989 "reset": true, 00:33:05.989 "compare": false, 00:33:05.989 "compare_and_write": false, 00:33:05.989 "abort": false, 00:33:05.989 "nvme_admin": false, 
00:33:05.989 "nvme_io": false 00:33:05.989 }, 00:33:05.989 "memory_domains": [ 00:33:05.989 { 00:33:05.989 "dma_device_id": "system", 00:33:05.989 "dma_device_type": 1 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:05.989 "dma_device_type": 2 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "system", 00:33:05.989 "dma_device_type": 1 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:05.989 "dma_device_type": 2 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "system", 00:33:05.989 "dma_device_type": 1 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:05.989 "dma_device_type": 2 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "system", 00:33:05.989 "dma_device_type": 1 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:05.989 "dma_device_type": 2 00:33:05.989 } 00:33:05.989 ], 00:33:05.989 "driver_specific": { 00:33:05.989 "raid": { 00:33:05.989 "uuid": "c4d42a13-7c9c-4c1a-9301-5d17a6e56042", 00:33:05.989 "strip_size_kb": 64, 00:33:05.989 "state": "online", 00:33:05.989 "raid_level": "raid0", 00:33:05.989 "superblock": false, 00:33:05.989 "num_base_bdevs": 4, 00:33:05.989 "num_base_bdevs_discovered": 4, 00:33:05.989 "num_base_bdevs_operational": 4, 00:33:05.989 "base_bdevs_list": [ 00:33:05.989 { 00:33:05.989 "name": "BaseBdev1", 00:33:05.989 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:05.989 "is_configured": true, 00:33:05.989 "data_offset": 0, 00:33:05.989 "data_size": 65536 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "name": "BaseBdev2", 00:33:05.989 "uuid": "739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:05.989 "is_configured": true, 00:33:05.989 "data_offset": 0, 00:33:05.989 "data_size": 65536 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "name": "BaseBdev3", 00:33:05.989 "uuid": "64468cba-ca56-405c-933f-3828843893e2", 00:33:05.989 "is_configured": 
true, 00:33:05.989 "data_offset": 0, 00:33:05.989 "data_size": 65536 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "name": "BaseBdev4", 00:33:05.989 "uuid": "6a7e508b-db70-4cf1-b059-753cb54e7e08", 00:33:05.989 "is_configured": true, 00:33:05.989 "data_offset": 0, 00:33:05.989 "data_size": 65536 00:33:05.989 } 00:33:05.989 ] 00:33:05.989 } 00:33:05.989 } 00:33:05.989 }' 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:33:05.989 BaseBdev2 00:33:05.989 BaseBdev3 00:33:05.989 BaseBdev4' 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:05.989 "name": "BaseBdev1", 00:33:05.989 "aliases": [ 00:33:05.989 "faa32ebf-5b2c-4091-911f-c08f164d77cf" 00:33:05.989 ], 00:33:05.989 "product_name": "Malloc disk", 00:33:05.989 "block_size": 512, 00:33:05.989 "num_blocks": 65536, 00:33:05.989 "uuid": "faa32ebf-5b2c-4091-911f-c08f164d77cf", 00:33:05.989 "assigned_rate_limits": { 00:33:05.989 "rw_ios_per_sec": 0, 00:33:05.989 "rw_mbytes_per_sec": 0, 00:33:05.989 "r_mbytes_per_sec": 0, 00:33:05.989 "w_mbytes_per_sec": 0 00:33:05.989 }, 00:33:05.989 "claimed": true, 00:33:05.989 "claim_type": "exclusive_write", 00:33:05.989 "zoned": false, 00:33:05.989 "supported_io_types": { 00:33:05.989 "read": true, 00:33:05.989 "write": true, 00:33:05.989 "unmap": true, 00:33:05.989 "write_zeroes": 
true, 00:33:05.989 "flush": true, 00:33:05.989 "reset": true, 00:33:05.989 "compare": false, 00:33:05.989 "compare_and_write": false, 00:33:05.989 "abort": true, 00:33:05.989 "nvme_admin": false, 00:33:05.989 "nvme_io": false 00:33:05.989 }, 00:33:05.989 "memory_domains": [ 00:33:05.989 { 00:33:05.989 "dma_device_id": "system", 00:33:05.989 "dma_device_type": 1 00:33:05.989 }, 00:33:05.989 { 00:33:05.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:05.989 "dma_device_type": 2 00:33:05.989 } 00:33:05.989 ], 00:33:05.989 "driver_specific": {} 00:33:05.989 }' 00:33:05.989 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:06.247 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:06.247 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:06.247 11:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:06.247 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:06.247 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:06.247 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:06.247 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:06.247 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:06.248 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:06.248 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:06.505 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:06.505 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:06.505 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:33:06.505 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:06.505 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:06.505 "name": "BaseBdev2", 00:33:06.505 "aliases": [ 00:33:06.505 "739fd3ab-2ea3-4afd-bf7c-1936a000df17" 00:33:06.505 ], 00:33:06.505 "product_name": "Malloc disk", 00:33:06.505 "block_size": 512, 00:33:06.505 "num_blocks": 65536, 00:33:06.505 "uuid": "739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:06.505 "assigned_rate_limits": { 00:33:06.505 "rw_ios_per_sec": 0, 00:33:06.505 "rw_mbytes_per_sec": 0, 00:33:06.505 "r_mbytes_per_sec": 0, 00:33:06.505 "w_mbytes_per_sec": 0 00:33:06.506 }, 00:33:06.506 "claimed": true, 00:33:06.506 "claim_type": "exclusive_write", 00:33:06.506 "zoned": false, 00:33:06.506 "supported_io_types": { 00:33:06.506 "read": true, 00:33:06.506 "write": true, 00:33:06.506 "unmap": true, 00:33:06.506 "write_zeroes": true, 00:33:06.506 "flush": true, 00:33:06.506 "reset": true, 00:33:06.506 "compare": false, 00:33:06.506 "compare_and_write": false, 00:33:06.506 "abort": true, 00:33:06.506 "nvme_admin": false, 00:33:06.506 "nvme_io": false 00:33:06.506 }, 00:33:06.506 "memory_domains": [ 00:33:06.506 { 00:33:06.506 "dma_device_id": "system", 00:33:06.506 "dma_device_type": 1 00:33:06.506 }, 00:33:06.506 { 00:33:06.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:06.506 "dma_device_type": 2 00:33:06.506 } 00:33:06.506 ], 00:33:06.506 "driver_specific": {} 00:33:06.506 }' 00:33:06.506 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:06.506 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:06.506 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:06.506 11:43:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:33:06.764 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:07.022 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:07.022 "name": "BaseBdev3", 00:33:07.022 "aliases": [ 00:33:07.022 "64468cba-ca56-405c-933f-3828843893e2" 00:33:07.022 ], 00:33:07.022 "product_name": "Malloc disk", 00:33:07.022 "block_size": 512, 00:33:07.022 "num_blocks": 65536, 00:33:07.022 "uuid": "64468cba-ca56-405c-933f-3828843893e2", 00:33:07.022 "assigned_rate_limits": { 00:33:07.022 "rw_ios_per_sec": 0, 00:33:07.022 "rw_mbytes_per_sec": 0, 00:33:07.022 "r_mbytes_per_sec": 0, 00:33:07.022 "w_mbytes_per_sec": 0 00:33:07.022 }, 00:33:07.022 "claimed": true, 00:33:07.022 "claim_type": "exclusive_write", 
00:33:07.022 "zoned": false, 00:33:07.022 "supported_io_types": { 00:33:07.022 "read": true, 00:33:07.022 "write": true, 00:33:07.022 "unmap": true, 00:33:07.022 "write_zeroes": true, 00:33:07.022 "flush": true, 00:33:07.022 "reset": true, 00:33:07.022 "compare": false, 00:33:07.022 "compare_and_write": false, 00:33:07.022 "abort": true, 00:33:07.022 "nvme_admin": false, 00:33:07.022 "nvme_io": false 00:33:07.022 }, 00:33:07.022 "memory_domains": [ 00:33:07.022 { 00:33:07.022 "dma_device_id": "system", 00:33:07.022 "dma_device_type": 1 00:33:07.022 }, 00:33:07.022 { 00:33:07.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:07.022 "dma_device_type": 2 00:33:07.022 } 00:33:07.022 ], 00:33:07.022 "driver_specific": {} 00:33:07.022 }' 00:33:07.022 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:07.022 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:07.022 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:07.022 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:07.022 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:07.280 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:07.280 11:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:07.280 11:43:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:07.280 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:33:07.538 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:07.538 "name": "BaseBdev4", 00:33:07.538 "aliases": [ 00:33:07.538 "6a7e508b-db70-4cf1-b059-753cb54e7e08" 00:33:07.538 ], 00:33:07.538 "product_name": "Malloc disk", 00:33:07.538 "block_size": 512, 00:33:07.538 "num_blocks": 65536, 00:33:07.538 "uuid": "6a7e508b-db70-4cf1-b059-753cb54e7e08", 00:33:07.538 "assigned_rate_limits": { 00:33:07.538 "rw_ios_per_sec": 0, 00:33:07.538 "rw_mbytes_per_sec": 0, 00:33:07.538 "r_mbytes_per_sec": 0, 00:33:07.538 "w_mbytes_per_sec": 0 00:33:07.538 }, 00:33:07.538 "claimed": true, 00:33:07.538 "claim_type": "exclusive_write", 00:33:07.538 "zoned": false, 00:33:07.538 "supported_io_types": { 00:33:07.538 "read": true, 00:33:07.538 "write": true, 00:33:07.538 "unmap": true, 00:33:07.538 "write_zeroes": true, 00:33:07.538 "flush": true, 00:33:07.538 "reset": true, 00:33:07.538 "compare": false, 00:33:07.538 "compare_and_write": false, 00:33:07.538 "abort": true, 00:33:07.538 "nvme_admin": false, 00:33:07.538 "nvme_io": false 00:33:07.538 }, 00:33:07.538 "memory_domains": [ 00:33:07.538 { 00:33:07.538 "dma_device_id": "system", 00:33:07.539 "dma_device_type": 1 00:33:07.539 }, 00:33:07.539 { 00:33:07.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:07.539 "dma_device_type": 2 00:33:07.539 } 00:33:07.539 ], 00:33:07.539 "driver_specific": {} 00:33:07.539 }' 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:07.539 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:07.797 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:07.797 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:07.797 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:07.797 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:07.797 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:07.797 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:33:08.054 [2024-06-10 11:43:51.760545] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:08.054 [2024-06-10 11:43:51.760575] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:08.054 [2024-06-10 11:43:51.760608] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@215 -- # return 1 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:08.054 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:08.055 "name": "Existed_Raid", 00:33:08.055 "uuid": "c4d42a13-7c9c-4c1a-9301-5d17a6e56042", 00:33:08.055 "strip_size_kb": 64, 00:33:08.055 "state": "offline", 00:33:08.055 "raid_level": "raid0", 00:33:08.055 "superblock": false, 00:33:08.055 
"num_base_bdevs": 4, 00:33:08.055 "num_base_bdevs_discovered": 3, 00:33:08.055 "num_base_bdevs_operational": 3, 00:33:08.055 "base_bdevs_list": [ 00:33:08.055 { 00:33:08.055 "name": null, 00:33:08.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:08.055 "is_configured": false, 00:33:08.055 "data_offset": 0, 00:33:08.055 "data_size": 65536 00:33:08.055 }, 00:33:08.055 { 00:33:08.055 "name": "BaseBdev2", 00:33:08.055 "uuid": "739fd3ab-2ea3-4afd-bf7c-1936a000df17", 00:33:08.055 "is_configured": true, 00:33:08.055 "data_offset": 0, 00:33:08.055 "data_size": 65536 00:33:08.055 }, 00:33:08.055 { 00:33:08.055 "name": "BaseBdev3", 00:33:08.055 "uuid": "64468cba-ca56-405c-933f-3828843893e2", 00:33:08.055 "is_configured": true, 00:33:08.055 "data_offset": 0, 00:33:08.055 "data_size": 65536 00:33:08.055 }, 00:33:08.055 { 00:33:08.055 "name": "BaseBdev4", 00:33:08.055 "uuid": "6a7e508b-db70-4cf1-b059-753cb54e7e08", 00:33:08.055 "is_configured": true, 00:33:08.055 "data_offset": 0, 00:33:08.055 "data_size": 65536 00:33:08.055 } 00:33:08.055 ] 00:33:08.055 }' 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:08.055 11:43:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:08.624 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:33:08.624 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:08.624 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:08.624 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:08.881 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:08.881 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:33:08.881 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:33:08.881 [2024-06-10 11:43:52.780016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:08.881 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:08.881 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:08.881 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:08.882 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:09.139 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:09.139 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:09.139 11:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:33:09.397 [2024-06-10 11:43:53.140756] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:09.397 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:33:09.655 [2024-06-10 11:43:53.493533] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:33:09.655 [2024-06-10 11:43:53.493564] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bee20 name Existed_Raid, state offline 00:33:09.655 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:09.655 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:09.655 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:09.655 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:33:09.913 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:33:09.913 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:33:09.913 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:33:09.913 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:33:09.913 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:09.913 11:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:33:10.171 BaseBdev2 00:33:10.171 11:43:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:33:10.171 11:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:33:10.171 11:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:10.171 11:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:10.171 11:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:10.171 11:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:10.172 11:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:10.172 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:33:10.430 [ 00:33:10.430 { 00:33:10.430 "name": "BaseBdev2", 00:33:10.430 "aliases": [ 00:33:10.430 "8ff96db1-ea2b-49fc-a05d-fda3fb829110" 00:33:10.430 ], 00:33:10.430 "product_name": "Malloc disk", 00:33:10.430 "block_size": 512, 00:33:10.430 "num_blocks": 65536, 00:33:10.430 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:10.430 "assigned_rate_limits": { 00:33:10.430 "rw_ios_per_sec": 0, 00:33:10.430 "rw_mbytes_per_sec": 0, 00:33:10.430 "r_mbytes_per_sec": 0, 00:33:10.430 "w_mbytes_per_sec": 0 00:33:10.430 }, 00:33:10.430 "claimed": false, 00:33:10.430 "zoned": false, 00:33:10.430 "supported_io_types": { 00:33:10.430 "read": true, 00:33:10.430 "write": true, 00:33:10.430 "unmap": true, 00:33:10.430 "write_zeroes": true, 00:33:10.430 "flush": true, 00:33:10.430 "reset": true, 00:33:10.430 "compare": false, 00:33:10.430 "compare_and_write": false, 00:33:10.430 "abort": true, 00:33:10.430 "nvme_admin": false, 00:33:10.430 "nvme_io": false 
00:33:10.430 }, 00:33:10.430 "memory_domains": [ 00:33:10.430 { 00:33:10.430 "dma_device_id": "system", 00:33:10.430 "dma_device_type": 1 00:33:10.430 }, 00:33:10.430 { 00:33:10.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:10.430 "dma_device_type": 2 00:33:10.430 } 00:33:10.430 ], 00:33:10.430 "driver_specific": {} 00:33:10.430 } 00:33:10.430 ] 00:33:10.430 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:10.430 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:10.430 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:10.430 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:33:10.688 BaseBdev3 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:10.688 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 
2000 00:33:10.947 [ 00:33:10.947 { 00:33:10.947 "name": "BaseBdev3", 00:33:10.947 "aliases": [ 00:33:10.947 "662f1fb0-0c43-43f5-b127-1c967f90a579" 00:33:10.947 ], 00:33:10.947 "product_name": "Malloc disk", 00:33:10.947 "block_size": 512, 00:33:10.947 "num_blocks": 65536, 00:33:10.947 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:10.947 "assigned_rate_limits": { 00:33:10.947 "rw_ios_per_sec": 0, 00:33:10.947 "rw_mbytes_per_sec": 0, 00:33:10.947 "r_mbytes_per_sec": 0, 00:33:10.947 "w_mbytes_per_sec": 0 00:33:10.947 }, 00:33:10.947 "claimed": false, 00:33:10.947 "zoned": false, 00:33:10.947 "supported_io_types": { 00:33:10.947 "read": true, 00:33:10.947 "write": true, 00:33:10.947 "unmap": true, 00:33:10.947 "write_zeroes": true, 00:33:10.947 "flush": true, 00:33:10.947 "reset": true, 00:33:10.947 "compare": false, 00:33:10.947 "compare_and_write": false, 00:33:10.947 "abort": true, 00:33:10.947 "nvme_admin": false, 00:33:10.947 "nvme_io": false 00:33:10.947 }, 00:33:10.947 "memory_domains": [ 00:33:10.947 { 00:33:10.947 "dma_device_id": "system", 00:33:10.947 "dma_device_type": 1 00:33:10.947 }, 00:33:10.947 { 00:33:10.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:10.947 "dma_device_type": 2 00:33:10.947 } 00:33:10.947 ], 00:33:10.947 "driver_specific": {} 00:33:10.947 } 00:33:10.947 ] 00:33:10.947 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:10.947 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:10.947 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:10.947 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:33:11.206 BaseBdev4 00:33:11.206 11:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:33:11.206 
11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:33:11.206 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:11.206 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:11.206 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:11.206 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:11.206 11:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:11.206 11:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:33:11.465 [ 00:33:11.465 { 00:33:11.465 "name": "BaseBdev4", 00:33:11.465 "aliases": [ 00:33:11.465 "276db33c-3b68-4490-8d75-8a921a9eaf19" 00:33:11.465 ], 00:33:11.465 "product_name": "Malloc disk", 00:33:11.465 "block_size": 512, 00:33:11.465 "num_blocks": 65536, 00:33:11.465 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:11.465 "assigned_rate_limits": { 00:33:11.465 "rw_ios_per_sec": 0, 00:33:11.465 "rw_mbytes_per_sec": 0, 00:33:11.465 "r_mbytes_per_sec": 0, 00:33:11.465 "w_mbytes_per_sec": 0 00:33:11.465 }, 00:33:11.465 "claimed": false, 00:33:11.465 "zoned": false, 00:33:11.465 "supported_io_types": { 00:33:11.465 "read": true, 00:33:11.465 "write": true, 00:33:11.465 "unmap": true, 00:33:11.465 "write_zeroes": true, 00:33:11.465 "flush": true, 00:33:11.465 "reset": true, 00:33:11.465 "compare": false, 00:33:11.465 "compare_and_write": false, 00:33:11.465 "abort": true, 00:33:11.465 "nvme_admin": false, 00:33:11.465 "nvme_io": false 00:33:11.465 }, 00:33:11.465 "memory_domains": [ 00:33:11.465 { 
00:33:11.465 "dma_device_id": "system", 00:33:11.465 "dma_device_type": 1 00:33:11.465 }, 00:33:11.465 { 00:33:11.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:11.465 "dma_device_type": 2 00:33:11.465 } 00:33:11.465 ], 00:33:11.465 "driver_specific": {} 00:33:11.465 } 00:33:11.465 ] 00:33:11.465 11:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:11.465 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:11.465 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:11.465 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:11.724 [2024-06-10 11:43:55.413163] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:11.724 [2024-06-10 11:43:55.413197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:11.724 [2024-06-10 11:43:55.413211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:11.724 [2024-06-10 11:43:55.414282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:11.724 [2024-06-10 11:43:55.414312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:11.724 "name": "Existed_Raid", 00:33:11.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:11.724 "strip_size_kb": 64, 00:33:11.724 "state": "configuring", 00:33:11.724 "raid_level": "raid0", 00:33:11.724 "superblock": false, 00:33:11.724 "num_base_bdevs": 4, 00:33:11.724 "num_base_bdevs_discovered": 3, 00:33:11.724 "num_base_bdevs_operational": 4, 00:33:11.724 "base_bdevs_list": [ 00:33:11.724 { 00:33:11.724 "name": "BaseBdev1", 00:33:11.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:11.724 "is_configured": false, 00:33:11.724 "data_offset": 0, 00:33:11.724 "data_size": 0 00:33:11.724 }, 00:33:11.724 { 00:33:11.724 "name": "BaseBdev2", 00:33:11.724 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:11.724 "is_configured": true, 00:33:11.724 "data_offset": 0, 00:33:11.724 "data_size": 65536 00:33:11.724 }, 00:33:11.724 { 00:33:11.724 "name": 
"BaseBdev3", 00:33:11.724 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:11.724 "is_configured": true, 00:33:11.724 "data_offset": 0, 00:33:11.724 "data_size": 65536 00:33:11.724 }, 00:33:11.724 { 00:33:11.724 "name": "BaseBdev4", 00:33:11.724 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:11.724 "is_configured": true, 00:33:11.724 "data_offset": 0, 00:33:11.724 "data_size": 65536 00:33:11.724 } 00:33:11.724 ] 00:33:11.724 }' 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:11.724 11:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:12.291 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:33:12.291 [2024-06-10 11:43:56.231253] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:12.550 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:12.551 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:12.551 "name": "Existed_Raid", 00:33:12.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:12.551 "strip_size_kb": 64, 00:33:12.551 "state": "configuring", 00:33:12.551 "raid_level": "raid0", 00:33:12.551 "superblock": false, 00:33:12.551 "num_base_bdevs": 4, 00:33:12.551 "num_base_bdevs_discovered": 2, 00:33:12.551 "num_base_bdevs_operational": 4, 00:33:12.551 "base_bdevs_list": [ 00:33:12.551 { 00:33:12.551 "name": "BaseBdev1", 00:33:12.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:12.551 "is_configured": false, 00:33:12.551 "data_offset": 0, 00:33:12.551 "data_size": 0 00:33:12.551 }, 00:33:12.551 { 00:33:12.551 "name": null, 00:33:12.551 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:12.551 "is_configured": false, 00:33:12.551 "data_offset": 0, 00:33:12.551 "data_size": 65536 00:33:12.551 }, 00:33:12.551 { 00:33:12.551 "name": "BaseBdev3", 00:33:12.551 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:12.551 "is_configured": true, 00:33:12.551 "data_offset": 0, 00:33:12.551 "data_size": 65536 00:33:12.551 }, 00:33:12.551 { 00:33:12.551 "name": "BaseBdev4", 00:33:12.551 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:12.551 "is_configured": true, 00:33:12.551 "data_offset": 0, 00:33:12.551 "data_size": 65536 00:33:12.551 } 00:33:12.551 ] 00:33:12.551 }' 00:33:12.551 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:33:12.551 11:43:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:13.118 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:13.118 11:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:33:13.376 [2024-06-10 11:43:57.261739] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:13.376 BaseBdev1 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:13.376 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:13.634 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:33:13.893 [ 00:33:13.893 { 00:33:13.893 "name": "BaseBdev1", 00:33:13.893 "aliases": [ 00:33:13.893 "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1" 00:33:13.893 ], 00:33:13.893 "product_name": "Malloc disk", 00:33:13.893 "block_size": 512, 00:33:13.893 "num_blocks": 65536, 00:33:13.893 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:13.893 "assigned_rate_limits": { 00:33:13.893 "rw_ios_per_sec": 0, 00:33:13.893 "rw_mbytes_per_sec": 0, 00:33:13.893 "r_mbytes_per_sec": 0, 00:33:13.893 "w_mbytes_per_sec": 0 00:33:13.893 }, 00:33:13.893 "claimed": true, 00:33:13.893 "claim_type": "exclusive_write", 00:33:13.893 "zoned": false, 00:33:13.893 "supported_io_types": { 00:33:13.893 "read": true, 00:33:13.893 "write": true, 00:33:13.893 "unmap": true, 00:33:13.893 "write_zeroes": true, 00:33:13.893 "flush": true, 00:33:13.893 "reset": true, 00:33:13.893 "compare": false, 00:33:13.893 "compare_and_write": false, 00:33:13.893 "abort": true, 00:33:13.893 "nvme_admin": false, 00:33:13.893 "nvme_io": false 00:33:13.893 }, 00:33:13.893 "memory_domains": [ 00:33:13.893 { 00:33:13.893 "dma_device_id": "system", 00:33:13.893 "dma_device_type": 1 00:33:13.893 }, 00:33:13.893 { 00:33:13.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:13.893 "dma_device_type": 2 00:33:13.893 } 00:33:13.893 ], 00:33:13.893 "driver_specific": {} 00:33:13.893 } 00:33:13.893 ] 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:13.893 11:43:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:13.893 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:13.893 "name": "Existed_Raid", 00:33:13.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:13.893 "strip_size_kb": 64, 00:33:13.893 "state": "configuring", 00:33:13.893 "raid_level": "raid0", 00:33:13.893 "superblock": false, 00:33:13.893 "num_base_bdevs": 4, 00:33:13.893 "num_base_bdevs_discovered": 3, 00:33:13.893 "num_base_bdevs_operational": 4, 00:33:13.893 "base_bdevs_list": [ 00:33:13.893 { 00:33:13.893 "name": "BaseBdev1", 00:33:13.893 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:13.893 "is_configured": true, 00:33:13.893 "data_offset": 0, 00:33:13.893 "data_size": 65536 00:33:13.893 }, 00:33:13.893 { 00:33:13.893 "name": null, 00:33:13.893 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:13.893 "is_configured": false, 00:33:13.893 "data_offset": 0, 00:33:13.893 "data_size": 65536 00:33:13.893 }, 00:33:13.893 { 00:33:13.893 "name": "BaseBdev3", 00:33:13.893 "uuid": 
"662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:13.893 "is_configured": true, 00:33:13.893 "data_offset": 0, 00:33:13.893 "data_size": 65536 00:33:13.893 }, 00:33:13.893 { 00:33:13.893 "name": "BaseBdev4", 00:33:13.893 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:13.893 "is_configured": true, 00:33:13.893 "data_offset": 0, 00:33:13.893 "data_size": 65536 00:33:13.893 } 00:33:13.893 ] 00:33:13.893 }' 00:33:13.894 11:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:13.894 11:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:14.459 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:14.459 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:33:14.718 [2024-06-10 11:43:58.629298] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:14.718 
11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:14.718 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:14.976 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:14.976 "name": "Existed_Raid", 00:33:14.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:14.976 "strip_size_kb": 64, 00:33:14.976 "state": "configuring", 00:33:14.976 "raid_level": "raid0", 00:33:14.976 "superblock": false, 00:33:14.976 "num_base_bdevs": 4, 00:33:14.976 "num_base_bdevs_discovered": 2, 00:33:14.976 "num_base_bdevs_operational": 4, 00:33:14.976 "base_bdevs_list": [ 00:33:14.976 { 00:33:14.976 "name": "BaseBdev1", 00:33:14.976 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:14.976 "is_configured": true, 00:33:14.976 "data_offset": 0, 00:33:14.976 "data_size": 65536 00:33:14.976 }, 00:33:14.976 { 00:33:14.976 "name": null, 00:33:14.976 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:14.976 "is_configured": false, 00:33:14.976 "data_offset": 0, 00:33:14.976 "data_size": 65536 00:33:14.976 }, 00:33:14.976 { 00:33:14.976 "name": null, 00:33:14.976 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:14.976 "is_configured": false, 00:33:14.976 "data_offset": 0, 
00:33:14.976 "data_size": 65536 00:33:14.976 }, 00:33:14.976 { 00:33:14.976 "name": "BaseBdev4", 00:33:14.976 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:14.976 "is_configured": true, 00:33:14.976 "data_offset": 0, 00:33:14.976 "data_size": 65536 00:33:14.976 } 00:33:14.976 ] 00:33:14.976 }' 00:33:14.976 11:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:14.976 11:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:15.541 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.541 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:33:15.801 [2024-06-10 11:43:59.660010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.801 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:16.060 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:16.060 "name": "Existed_Raid", 00:33:16.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:16.060 "strip_size_kb": 64, 00:33:16.060 "state": "configuring", 00:33:16.060 "raid_level": "raid0", 00:33:16.060 "superblock": false, 00:33:16.060 "num_base_bdevs": 4, 00:33:16.060 "num_base_bdevs_discovered": 3, 00:33:16.060 "num_base_bdevs_operational": 4, 00:33:16.060 "base_bdevs_list": [ 00:33:16.060 { 00:33:16.060 "name": "BaseBdev1", 00:33:16.060 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:16.060 "is_configured": true, 00:33:16.060 "data_offset": 0, 00:33:16.060 "data_size": 65536 00:33:16.060 }, 00:33:16.060 { 00:33:16.060 "name": null, 00:33:16.060 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:16.060 "is_configured": false, 00:33:16.060 "data_offset": 0, 00:33:16.060 "data_size": 65536 00:33:16.060 }, 00:33:16.060 { 00:33:16.060 "name": "BaseBdev3", 00:33:16.060 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:16.060 "is_configured": true, 00:33:16.060 "data_offset": 0, 00:33:16.060 "data_size": 65536 00:33:16.060 }, 00:33:16.060 { 00:33:16.060 
"name": "BaseBdev4", 00:33:16.060 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:16.060 "is_configured": true, 00:33:16.060 "data_offset": 0, 00:33:16.060 "data_size": 65536 00:33:16.060 } 00:33:16.060 ] 00:33:16.060 }' 00:33:16.060 11:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:16.060 11:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:16.625 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:16.625 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:33:16.625 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:33:16.625 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:33:16.883 [2024-06-10 11:44:00.654596] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:16.883 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:17.140 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:17.140 "name": "Existed_Raid", 00:33:17.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:17.140 "strip_size_kb": 64, 00:33:17.140 "state": "configuring", 00:33:17.140 "raid_level": "raid0", 00:33:17.140 "superblock": false, 00:33:17.140 "num_base_bdevs": 4, 00:33:17.140 "num_base_bdevs_discovered": 2, 00:33:17.140 "num_base_bdevs_operational": 4, 00:33:17.140 "base_bdevs_list": [ 00:33:17.140 { 00:33:17.140 "name": null, 00:33:17.140 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:17.140 "is_configured": false, 00:33:17.140 "data_offset": 0, 00:33:17.140 "data_size": 65536 00:33:17.140 }, 00:33:17.140 { 00:33:17.140 "name": null, 00:33:17.140 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:17.140 "is_configured": false, 00:33:17.140 "data_offset": 0, 00:33:17.140 "data_size": 65536 00:33:17.141 }, 00:33:17.141 { 00:33:17.141 "name": "BaseBdev3", 00:33:17.141 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:17.141 "is_configured": true, 00:33:17.141 "data_offset": 0, 00:33:17.141 "data_size": 65536 00:33:17.141 }, 00:33:17.141 { 00:33:17.141 "name": "BaseBdev4", 00:33:17.141 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:17.141 "is_configured": true, 
00:33:17.141 "data_offset": 0, 00:33:17.141 "data_size": 65536 00:33:17.141 } 00:33:17.141 ] 00:33:17.141 }' 00:33:17.141 11:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:17.141 11:44:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:17.706 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:17.706 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:33:17.706 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:33:17.706 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:33:17.964 [2024-06-10 11:44:01.699342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:17.964 "name": "Existed_Raid", 00:33:17.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:17.964 "strip_size_kb": 64, 00:33:17.964 "state": "configuring", 00:33:17.964 "raid_level": "raid0", 00:33:17.964 "superblock": false, 00:33:17.964 "num_base_bdevs": 4, 00:33:17.964 "num_base_bdevs_discovered": 3, 00:33:17.964 "num_base_bdevs_operational": 4, 00:33:17.964 "base_bdevs_list": [ 00:33:17.964 { 00:33:17.964 "name": null, 00:33:17.964 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:17.964 "is_configured": false, 00:33:17.964 "data_offset": 0, 00:33:17.964 "data_size": 65536 00:33:17.964 }, 00:33:17.964 { 00:33:17.964 "name": "BaseBdev2", 00:33:17.964 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:17.964 "is_configured": true, 00:33:17.964 "data_offset": 0, 00:33:17.964 "data_size": 65536 00:33:17.964 }, 00:33:17.964 { 00:33:17.964 "name": "BaseBdev3", 00:33:17.964 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:17.964 "is_configured": true, 00:33:17.964 "data_offset": 0, 00:33:17.964 "data_size": 65536 00:33:17.964 }, 00:33:17.964 { 00:33:17.964 "name": "BaseBdev4", 00:33:17.964 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:17.964 "is_configured": true, 00:33:17.964 "data_offset": 0, 00:33:17.964 "data_size": 65536 00:33:17.964 } 
00:33:17.964 ] 00:33:17.964 }' 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:17.964 11:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:18.529 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:33:18.529 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:18.787 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:33:18.787 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:18.787 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:33:18.787 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9b8b61a3-19ac-4606-91fd-a7ad5b3965f1 00:33:19.046 [2024-06-10 11:44:02.873121] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:33:19.046 [2024-06-10 11:44:02.873151] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13bfca0 00:33:19.046 [2024-06-10 11:44:02.873157] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:33:19.046 [2024-06-10 11:44:02.873290] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c40c0 00:33:19.046 [2024-06-10 11:44:02.873367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13bfca0 00:33:19.046 [2024-06-10 11:44:02.873373] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x13bfca0 00:33:19.046 [2024-06-10 11:44:02.873487] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:19.046 NewBaseBdev 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:19.046 11:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:33:19.304 [ 00:33:19.304 { 00:33:19.304 "name": "NewBaseBdev", 00:33:19.304 "aliases": [ 00:33:19.304 "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1" 00:33:19.304 ], 00:33:19.304 "product_name": "Malloc disk", 00:33:19.304 "block_size": 512, 00:33:19.304 "num_blocks": 65536, 00:33:19.304 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:19.304 "assigned_rate_limits": { 00:33:19.304 "rw_ios_per_sec": 0, 00:33:19.304 "rw_mbytes_per_sec": 0, 00:33:19.304 "r_mbytes_per_sec": 0, 00:33:19.304 "w_mbytes_per_sec": 0 00:33:19.304 }, 00:33:19.304 "claimed": true, 00:33:19.304 "claim_type": "exclusive_write", 00:33:19.304 "zoned": false, 00:33:19.304 "supported_io_types": { 00:33:19.304 "read": true, 00:33:19.304 "write": true, 
00:33:19.304 "unmap": true, 00:33:19.304 "write_zeroes": true, 00:33:19.304 "flush": true, 00:33:19.304 "reset": true, 00:33:19.304 "compare": false, 00:33:19.304 "compare_and_write": false, 00:33:19.304 "abort": true, 00:33:19.304 "nvme_admin": false, 00:33:19.304 "nvme_io": false 00:33:19.304 }, 00:33:19.304 "memory_domains": [ 00:33:19.304 { 00:33:19.304 "dma_device_id": "system", 00:33:19.304 "dma_device_type": 1 00:33:19.304 }, 00:33:19.304 { 00:33:19.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:19.304 "dma_device_type": 2 00:33:19.304 } 00:33:19.304 ], 00:33:19.304 "driver_specific": {} 00:33:19.304 } 00:33:19.304 ] 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:19.304 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:19.562 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:19.562 "name": "Existed_Raid", 00:33:19.562 "uuid": "eb4595df-b500-406b-9d87-22cb6999760f", 00:33:19.562 "strip_size_kb": 64, 00:33:19.562 "state": "online", 00:33:19.562 "raid_level": "raid0", 00:33:19.562 "superblock": false, 00:33:19.562 "num_base_bdevs": 4, 00:33:19.562 "num_base_bdevs_discovered": 4, 00:33:19.562 "num_base_bdevs_operational": 4, 00:33:19.562 "base_bdevs_list": [ 00:33:19.562 { 00:33:19.562 "name": "NewBaseBdev", 00:33:19.562 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:19.562 "is_configured": true, 00:33:19.562 "data_offset": 0, 00:33:19.562 "data_size": 65536 00:33:19.562 }, 00:33:19.562 { 00:33:19.562 "name": "BaseBdev2", 00:33:19.562 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:19.562 "is_configured": true, 00:33:19.562 "data_offset": 0, 00:33:19.562 "data_size": 65536 00:33:19.562 }, 00:33:19.562 { 00:33:19.562 "name": "BaseBdev3", 00:33:19.562 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:19.562 "is_configured": true, 00:33:19.562 "data_offset": 0, 00:33:19.562 "data_size": 65536 00:33:19.562 }, 00:33:19.562 { 00:33:19.562 "name": "BaseBdev4", 00:33:19.562 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:19.562 "is_configured": true, 00:33:19.562 "data_offset": 0, 00:33:19.562 "data_size": 65536 00:33:19.562 } 00:33:19.562 ] 00:33:19.562 }' 00:33:19.562 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:19.562 11:44:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties 
Existed_Raid 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:20.128 11:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:20.128 [2024-06-10 11:44:04.052371] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:20.386 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:20.386 "name": "Existed_Raid", 00:33:20.386 "aliases": [ 00:33:20.386 "eb4595df-b500-406b-9d87-22cb6999760f" 00:33:20.386 ], 00:33:20.386 "product_name": "Raid Volume", 00:33:20.386 "block_size": 512, 00:33:20.386 "num_blocks": 262144, 00:33:20.386 "uuid": "eb4595df-b500-406b-9d87-22cb6999760f", 00:33:20.386 "assigned_rate_limits": { 00:33:20.386 "rw_ios_per_sec": 0, 00:33:20.386 "rw_mbytes_per_sec": 0, 00:33:20.386 "r_mbytes_per_sec": 0, 00:33:20.386 "w_mbytes_per_sec": 0 00:33:20.386 }, 00:33:20.386 "claimed": false, 00:33:20.386 "zoned": false, 00:33:20.386 "supported_io_types": { 00:33:20.386 "read": true, 00:33:20.386 "write": true, 00:33:20.386 "unmap": true, 00:33:20.386 "write_zeroes": true, 00:33:20.386 "flush": true, 00:33:20.386 "reset": true, 00:33:20.386 "compare": false, 00:33:20.386 "compare_and_write": false, 00:33:20.386 "abort": false, 00:33:20.386 "nvme_admin": false, 
00:33:20.386 "nvme_io": false 00:33:20.386 }, 00:33:20.386 "memory_domains": [ 00:33:20.386 { 00:33:20.386 "dma_device_id": "system", 00:33:20.386 "dma_device_type": 1 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:20.387 "dma_device_type": 2 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "system", 00:33:20.387 "dma_device_type": 1 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:20.387 "dma_device_type": 2 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "system", 00:33:20.387 "dma_device_type": 1 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:20.387 "dma_device_type": 2 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "system", 00:33:20.387 "dma_device_type": 1 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:20.387 "dma_device_type": 2 00:33:20.387 } 00:33:20.387 ], 00:33:20.387 "driver_specific": { 00:33:20.387 "raid": { 00:33:20.387 "uuid": "eb4595df-b500-406b-9d87-22cb6999760f", 00:33:20.387 "strip_size_kb": 64, 00:33:20.387 "state": "online", 00:33:20.387 "raid_level": "raid0", 00:33:20.387 "superblock": false, 00:33:20.387 "num_base_bdevs": 4, 00:33:20.387 "num_base_bdevs_discovered": 4, 00:33:20.387 "num_base_bdevs_operational": 4, 00:33:20.387 "base_bdevs_list": [ 00:33:20.387 { 00:33:20.387 "name": "NewBaseBdev", 00:33:20.387 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:20.387 "is_configured": true, 00:33:20.387 "data_offset": 0, 00:33:20.387 "data_size": 65536 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "name": "BaseBdev2", 00:33:20.387 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:20.387 "is_configured": true, 00:33:20.387 "data_offset": 0, 00:33:20.387 "data_size": 65536 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "name": "BaseBdev3", 00:33:20.387 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:20.387 "is_configured": 
true, 00:33:20.387 "data_offset": 0, 00:33:20.387 "data_size": 65536 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "name": "BaseBdev4", 00:33:20.387 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:20.387 "is_configured": true, 00:33:20.387 "data_offset": 0, 00:33:20.387 "data_size": 65536 00:33:20.387 } 00:33:20.387 ] 00:33:20.387 } 00:33:20.387 } 00:33:20.387 }' 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:33:20.387 BaseBdev2 00:33:20.387 BaseBdev3 00:33:20.387 BaseBdev4' 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:20.387 "name": "NewBaseBdev", 00:33:20.387 "aliases": [ 00:33:20.387 "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1" 00:33:20.387 ], 00:33:20.387 "product_name": "Malloc disk", 00:33:20.387 "block_size": 512, 00:33:20.387 "num_blocks": 65536, 00:33:20.387 "uuid": "9b8b61a3-19ac-4606-91fd-a7ad5b3965f1", 00:33:20.387 "assigned_rate_limits": { 00:33:20.387 "rw_ios_per_sec": 0, 00:33:20.387 "rw_mbytes_per_sec": 0, 00:33:20.387 "r_mbytes_per_sec": 0, 00:33:20.387 "w_mbytes_per_sec": 0 00:33:20.387 }, 00:33:20.387 "claimed": true, 00:33:20.387 "claim_type": "exclusive_write", 00:33:20.387 "zoned": false, 00:33:20.387 "supported_io_types": { 00:33:20.387 "read": true, 00:33:20.387 "write": true, 00:33:20.387 "unmap": true, 00:33:20.387 
"write_zeroes": true, 00:33:20.387 "flush": true, 00:33:20.387 "reset": true, 00:33:20.387 "compare": false, 00:33:20.387 "compare_and_write": false, 00:33:20.387 "abort": true, 00:33:20.387 "nvme_admin": false, 00:33:20.387 "nvme_io": false 00:33:20.387 }, 00:33:20.387 "memory_domains": [ 00:33:20.387 { 00:33:20.387 "dma_device_id": "system", 00:33:20.387 "dma_device_type": 1 00:33:20.387 }, 00:33:20.387 { 00:33:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:20.387 "dma_device_type": 2 00:33:20.387 } 00:33:20.387 ], 00:33:20.387 "driver_specific": {} 00:33:20.387 }' 00:33:20.387 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:20.646 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:20.904 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:20.904 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:20.904 11:44:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:33:20.904 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:20.904 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:20.904 "name": "BaseBdev2", 00:33:20.904 "aliases": [ 00:33:20.904 "8ff96db1-ea2b-49fc-a05d-fda3fb829110" 00:33:20.904 ], 00:33:20.904 "product_name": "Malloc disk", 00:33:20.904 "block_size": 512, 00:33:20.904 "num_blocks": 65536, 00:33:20.904 "uuid": "8ff96db1-ea2b-49fc-a05d-fda3fb829110", 00:33:20.904 "assigned_rate_limits": { 00:33:20.904 "rw_ios_per_sec": 0, 00:33:20.904 "rw_mbytes_per_sec": 0, 00:33:20.904 "r_mbytes_per_sec": 0, 00:33:20.904 "w_mbytes_per_sec": 0 00:33:20.904 }, 00:33:20.904 "claimed": true, 00:33:20.904 "claim_type": "exclusive_write", 00:33:20.904 "zoned": false, 00:33:20.905 "supported_io_types": { 00:33:20.905 "read": true, 00:33:20.905 "write": true, 00:33:20.905 "unmap": true, 00:33:20.905 "write_zeroes": true, 00:33:20.905 "flush": true, 00:33:20.905 "reset": true, 00:33:20.905 "compare": false, 00:33:20.905 "compare_and_write": false, 00:33:20.905 "abort": true, 00:33:20.905 "nvme_admin": false, 00:33:20.905 "nvme_io": false 00:33:20.905 }, 00:33:20.905 "memory_domains": [ 00:33:20.905 { 00:33:20.905 "dma_device_id": "system", 00:33:20.905 "dma_device_type": 1 00:33:20.905 }, 00:33:20.905 { 00:33:20.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:20.905 "dma_device_type": 2 00:33:20.905 } 00:33:20.905 ], 00:33:20.905 "driver_specific": {} 00:33:20.905 }' 00:33:20.905 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:20.905 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:21.163 11:44:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:21.163 11:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:21.163 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:21.163 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:21.163 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:21.163 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:33:21.163 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:21.421 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:21.421 "name": "BaseBdev3", 00:33:21.421 "aliases": [ 00:33:21.421 "662f1fb0-0c43-43f5-b127-1c967f90a579" 00:33:21.421 ], 00:33:21.421 "product_name": "Malloc disk", 00:33:21.421 "block_size": 512, 00:33:21.421 "num_blocks": 65536, 00:33:21.421 "uuid": "662f1fb0-0c43-43f5-b127-1c967f90a579", 00:33:21.421 "assigned_rate_limits": { 00:33:21.421 "rw_ios_per_sec": 0, 00:33:21.421 "rw_mbytes_per_sec": 0, 00:33:21.421 "r_mbytes_per_sec": 0, 00:33:21.421 "w_mbytes_per_sec": 0 00:33:21.421 }, 00:33:21.421 "claimed": true, 00:33:21.421 "claim_type": "exclusive_write", 
00:33:21.421 "zoned": false, 00:33:21.421 "supported_io_types": { 00:33:21.421 "read": true, 00:33:21.421 "write": true, 00:33:21.421 "unmap": true, 00:33:21.421 "write_zeroes": true, 00:33:21.421 "flush": true, 00:33:21.421 "reset": true, 00:33:21.421 "compare": false, 00:33:21.421 "compare_and_write": false, 00:33:21.421 "abort": true, 00:33:21.421 "nvme_admin": false, 00:33:21.421 "nvme_io": false 00:33:21.421 }, 00:33:21.421 "memory_domains": [ 00:33:21.421 { 00:33:21.421 "dma_device_id": "system", 00:33:21.421 "dma_device_type": 1 00:33:21.421 }, 00:33:21.421 { 00:33:21.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:21.421 "dma_device_type": 2 00:33:21.421 } 00:33:21.421 ], 00:33:21.421 "driver_specific": {} 00:33:21.421 }' 00:33:21.421 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:21.421 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:21.421 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:21.421 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:21.421 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:21.679 11:44:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:33:21.679 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:21.938 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:21.938 "name": "BaseBdev4", 00:33:21.938 "aliases": [ 00:33:21.938 "276db33c-3b68-4490-8d75-8a921a9eaf19" 00:33:21.938 ], 00:33:21.938 "product_name": "Malloc disk", 00:33:21.938 "block_size": 512, 00:33:21.938 "num_blocks": 65536, 00:33:21.938 "uuid": "276db33c-3b68-4490-8d75-8a921a9eaf19", 00:33:21.938 "assigned_rate_limits": { 00:33:21.938 "rw_ios_per_sec": 0, 00:33:21.938 "rw_mbytes_per_sec": 0, 00:33:21.938 "r_mbytes_per_sec": 0, 00:33:21.938 "w_mbytes_per_sec": 0 00:33:21.938 }, 00:33:21.938 "claimed": true, 00:33:21.938 "claim_type": "exclusive_write", 00:33:21.938 "zoned": false, 00:33:21.938 "supported_io_types": { 00:33:21.938 "read": true, 00:33:21.938 "write": true, 00:33:21.938 "unmap": true, 00:33:21.938 "write_zeroes": true, 00:33:21.938 "flush": true, 00:33:21.938 "reset": true, 00:33:21.938 "compare": false, 00:33:21.938 "compare_and_write": false, 00:33:21.938 "abort": true, 00:33:21.938 "nvme_admin": false, 00:33:21.938 "nvme_io": false 00:33:21.938 }, 00:33:21.938 "memory_domains": [ 00:33:21.938 { 00:33:21.938 "dma_device_id": "system", 00:33:21.938 "dma_device_type": 1 00:33:21.938 }, 00:33:21.938 { 00:33:21.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:21.938 "dma_device_type": 2 00:33:21.938 } 00:33:21.938 ], 00:33:21.938 "driver_specific": {} 00:33:21.938 }' 00:33:21.938 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:21.938 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:33:21.938 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:21.938 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:21.938 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:22.196 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:22.196 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:22.196 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:22.196 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:22.196 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:22.196 11:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:22.196 11:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:22.196 11:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:22.455 [2024-06-10 11:44:06.197753] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:22.455 [2024-06-10 11:44:06.197775] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:22.455 [2024-06-10 11:44:06.197812] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:22.455 [2024-06-10 11:44:06.197854] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:22.455 [2024-06-10 11:44:06.197862] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bfca0 name Existed_Raid, state offline 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 179843 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 179843 ']' 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 179843 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 179843 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 179843' 00:33:22.455 killing process with pid 179843 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 179843 00:33:22.455 [2024-06-10 11:44:06.254051] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:22.455 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 179843 00:33:22.455 [2024-06-10 11:44:06.294613] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:33:22.714 00:33:22.714 real 0m24.674s 00:33:22.714 user 0m44.959s 00:33:22.714 sys 0m4.817s 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:33:22.714 ************************************ 00:33:22.714 END TEST raid_state_function_test 00:33:22.714 
************************************ 00:33:22.714 11:44:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:33:22.714 11:44:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:33:22.714 11:44:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:22.714 11:44:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:22.714 ************************************ 00:33:22.714 START TEST raid_state_function_test_sb 00:33:22.714 ************************************ 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 true 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:33:22.714 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:33:22.715 11:44:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=183700 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 183700' 00:33:22.715 Process raid pid: 183700 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 183700 /var/tmp/spdk-raid.sock 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 183700 ']' 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:22.715 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:22.715 11:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:22.715 [2024-06-10 11:44:06.647633] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:33:22.715 [2024-06-10 11:44:06.647685] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:22.974 [2024-06-10 11:44:06.736219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:22.974 [2024-06-10 11:44:06.823695] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:22.974 [2024-06-10 11:44:06.882800] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:22.974 [2024-06-10 11:44:06.882830] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:23.540 11:44:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:23.540 11:44:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:33:23.540 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:23.798 [2024-06-10 11:44:07.609349] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:23.798 [2024-06-10 11:44:07.609385] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:23.798 [2024-06-10 11:44:07.609393] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:23.798 [2024-06-10 11:44:07.609416] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:23.798 [2024-06-10 11:44:07.609422] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:33:23.798 [2024-06-10 11:44:07.609429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:33:23.798 
[2024-06-10 11:44:07.609435] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:33:23.798 [2024-06-10 11:44:07.609445] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:23.798 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:24.056 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:24.056 "name": "Existed_Raid", 00:33:24.056 "uuid": "7d08ccfb-b92a-48e7-858d-67f0c45c8ccb", 00:33:24.056 
"strip_size_kb": 64, 00:33:24.056 "state": "configuring", 00:33:24.056 "raid_level": "raid0", 00:33:24.056 "superblock": true, 00:33:24.056 "num_base_bdevs": 4, 00:33:24.056 "num_base_bdevs_discovered": 0, 00:33:24.056 "num_base_bdevs_operational": 4, 00:33:24.056 "base_bdevs_list": [ 00:33:24.056 { 00:33:24.056 "name": "BaseBdev1", 00:33:24.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:24.056 "is_configured": false, 00:33:24.056 "data_offset": 0, 00:33:24.056 "data_size": 0 00:33:24.056 }, 00:33:24.056 { 00:33:24.056 "name": "BaseBdev2", 00:33:24.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:24.056 "is_configured": false, 00:33:24.056 "data_offset": 0, 00:33:24.056 "data_size": 0 00:33:24.056 }, 00:33:24.056 { 00:33:24.056 "name": "BaseBdev3", 00:33:24.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:24.056 "is_configured": false, 00:33:24.056 "data_offset": 0, 00:33:24.056 "data_size": 0 00:33:24.056 }, 00:33:24.056 { 00:33:24.056 "name": "BaseBdev4", 00:33:24.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:24.056 "is_configured": false, 00:33:24.056 "data_offset": 0, 00:33:24.056 "data_size": 0 00:33:24.056 } 00:33:24.056 ] 00:33:24.056 }' 00:33:24.056 11:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:24.056 11:44:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:24.621 11:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:24.621 [2024-06-10 11:44:08.467482] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:24.621 [2024-06-10 11:44:08.467506] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a7a550 name Existed_Raid, state configuring 00:33:24.621 11:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:24.878 [2024-06-10 11:44:08.655999] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:24.878 [2024-06-10 11:44:08.656023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:24.878 [2024-06-10 11:44:08.656030] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:24.878 [2024-06-10 11:44:08.656038] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:24.878 [2024-06-10 11:44:08.656044] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:33:24.878 [2024-06-10 11:44:08.656051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:33:24.878 [2024-06-10 11:44:08.656057] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:33:24.878 [2024-06-10 11:44:08.656064] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:33:24.878 11:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:33:25.136 [2024-06-10 11:44:08.837062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:25.136 BaseBdev1 00:33:25.136 11:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:33:25.136 11:44:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:33:25.136 11:44:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:25.136 11:44:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:25.136 11:44:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:25.136 11:44:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:25.136 11:44:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:25.136 11:44:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:33:25.393 [ 00:33:25.393 { 00:33:25.393 "name": "BaseBdev1", 00:33:25.393 "aliases": [ 00:33:25.393 "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf" 00:33:25.393 ], 00:33:25.393 "product_name": "Malloc disk", 00:33:25.393 "block_size": 512, 00:33:25.393 "num_blocks": 65536, 00:33:25.393 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:25.393 "assigned_rate_limits": { 00:33:25.393 "rw_ios_per_sec": 0, 00:33:25.393 "rw_mbytes_per_sec": 0, 00:33:25.393 "r_mbytes_per_sec": 0, 00:33:25.393 "w_mbytes_per_sec": 0 00:33:25.393 }, 00:33:25.393 "claimed": true, 00:33:25.393 "claim_type": "exclusive_write", 00:33:25.393 "zoned": false, 00:33:25.393 "supported_io_types": { 00:33:25.393 "read": true, 00:33:25.393 "write": true, 00:33:25.393 "unmap": true, 00:33:25.393 "write_zeroes": true, 00:33:25.393 "flush": true, 00:33:25.393 "reset": true, 00:33:25.393 "compare": false, 00:33:25.393 "compare_and_write": false, 00:33:25.393 "abort": true, 00:33:25.393 "nvme_admin": false, 00:33:25.393 "nvme_io": false 00:33:25.393 }, 00:33:25.393 "memory_domains": [ 00:33:25.393 { 00:33:25.393 "dma_device_id": "system", 00:33:25.393 "dma_device_type": 1 00:33:25.393 }, 00:33:25.393 { 00:33:25.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:25.393 
"dma_device_type": 2 00:33:25.393 } 00:33:25.393 ], 00:33:25.393 "driver_specific": {} 00:33:25.393 } 00:33:25.393 ] 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:25.393 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:25.394 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:25.394 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:25.394 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:25.394 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:25.651 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:25.651 "name": "Existed_Raid", 00:33:25.651 "uuid": "4a307d3a-c8ba-47a6-8011-caa17bfdb896", 00:33:25.651 "strip_size_kb": 64, 
00:33:25.651 "state": "configuring", 00:33:25.651 "raid_level": "raid0", 00:33:25.651 "superblock": true, 00:33:25.651 "num_base_bdevs": 4, 00:33:25.651 "num_base_bdevs_discovered": 1, 00:33:25.651 "num_base_bdevs_operational": 4, 00:33:25.651 "base_bdevs_list": [ 00:33:25.651 { 00:33:25.651 "name": "BaseBdev1", 00:33:25.651 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:25.651 "is_configured": true, 00:33:25.651 "data_offset": 2048, 00:33:25.651 "data_size": 63488 00:33:25.651 }, 00:33:25.651 { 00:33:25.651 "name": "BaseBdev2", 00:33:25.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:25.651 "is_configured": false, 00:33:25.651 "data_offset": 0, 00:33:25.651 "data_size": 0 00:33:25.651 }, 00:33:25.651 { 00:33:25.651 "name": "BaseBdev3", 00:33:25.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:25.651 "is_configured": false, 00:33:25.651 "data_offset": 0, 00:33:25.651 "data_size": 0 00:33:25.651 }, 00:33:25.651 { 00:33:25.651 "name": "BaseBdev4", 00:33:25.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:25.651 "is_configured": false, 00:33:25.651 "data_offset": 0, 00:33:25.651 "data_size": 0 00:33:25.651 } 00:33:25.651 ] 00:33:25.651 }' 00:33:25.651 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:25.651 11:44:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:26.236 11:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:26.236 [2024-06-10 11:44:10.016107] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:26.236 [2024-06-10 11:44:10.016142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a79dc0 name Existed_Raid, state configuring 00:33:26.236 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:26.510 [2024-06-10 11:44:10.192601] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:26.510 [2024-06-10 11:44:10.193672] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:26.510 [2024-06-10 11:44:10.193699] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:26.510 [2024-06-10 11:44:10.193707] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:33:26.510 [2024-06-10 11:44:10.193715] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:33:26.510 [2024-06-10 11:44:10.193721] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:33:26.510 [2024-06-10 11:44:10.193728] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:33:26.510 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:33:26.510 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:26.511 "name": "Existed_Raid", 00:33:26.511 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:26.511 "strip_size_kb": 64, 00:33:26.511 "state": "configuring", 00:33:26.511 "raid_level": "raid0", 00:33:26.511 "superblock": true, 00:33:26.511 "num_base_bdevs": 4, 00:33:26.511 "num_base_bdevs_discovered": 1, 00:33:26.511 "num_base_bdevs_operational": 4, 00:33:26.511 "base_bdevs_list": [ 00:33:26.511 { 00:33:26.511 "name": "BaseBdev1", 00:33:26.511 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:26.511 "is_configured": true, 00:33:26.511 "data_offset": 2048, 00:33:26.511 "data_size": 63488 00:33:26.511 }, 00:33:26.511 { 00:33:26.511 "name": "BaseBdev2", 00:33:26.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:26.511 "is_configured": false, 00:33:26.511 "data_offset": 0, 00:33:26.511 "data_size": 0 00:33:26.511 }, 00:33:26.511 { 00:33:26.511 "name": "BaseBdev3", 00:33:26.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:26.511 "is_configured": false, 00:33:26.511 "data_offset": 0, 00:33:26.511 
"data_size": 0 00:33:26.511 }, 00:33:26.511 { 00:33:26.511 "name": "BaseBdev4", 00:33:26.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:26.511 "is_configured": false, 00:33:26.511 "data_offset": 0, 00:33:26.511 "data_size": 0 00:33:26.511 } 00:33:26.511 ] 00:33:26.511 }' 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:26.511 11:44:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:27.075 11:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:33:27.333 [2024-06-10 11:44:11.033689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:27.333 BaseBdev2 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:27.333 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:33:27.591 [ 
00:33:27.591 { 00:33:27.591 "name": "BaseBdev2", 00:33:27.591 "aliases": [ 00:33:27.591 "bf276d4c-d012-4e3d-856e-4dbe43e528b8" 00:33:27.591 ], 00:33:27.591 "product_name": "Malloc disk", 00:33:27.591 "block_size": 512, 00:33:27.591 "num_blocks": 65536, 00:33:27.591 "uuid": "bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:27.591 "assigned_rate_limits": { 00:33:27.591 "rw_ios_per_sec": 0, 00:33:27.591 "rw_mbytes_per_sec": 0, 00:33:27.591 "r_mbytes_per_sec": 0, 00:33:27.591 "w_mbytes_per_sec": 0 00:33:27.591 }, 00:33:27.591 "claimed": true, 00:33:27.591 "claim_type": "exclusive_write", 00:33:27.591 "zoned": false, 00:33:27.591 "supported_io_types": { 00:33:27.591 "read": true, 00:33:27.591 "write": true, 00:33:27.591 "unmap": true, 00:33:27.591 "write_zeroes": true, 00:33:27.591 "flush": true, 00:33:27.591 "reset": true, 00:33:27.591 "compare": false, 00:33:27.591 "compare_and_write": false, 00:33:27.591 "abort": true, 00:33:27.591 "nvme_admin": false, 00:33:27.591 "nvme_io": false 00:33:27.591 }, 00:33:27.591 "memory_domains": [ 00:33:27.591 { 00:33:27.591 "dma_device_id": "system", 00:33:27.591 "dma_device_type": 1 00:33:27.591 }, 00:33:27.591 { 00:33:27.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:27.591 "dma_device_type": 2 00:33:27.591 } 00:33:27.591 ], 00:33:27.591 "driver_specific": {} 00:33:27.591 } 00:33:27.591 ] 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:27.591 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:27.592 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:27.592 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:27.592 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:27.592 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:27.849 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:27.849 "name": "Existed_Raid", 00:33:27.849 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:27.849 "strip_size_kb": 64, 00:33:27.849 "state": "configuring", 00:33:27.849 "raid_level": "raid0", 00:33:27.850 "superblock": true, 00:33:27.850 "num_base_bdevs": 4, 00:33:27.850 "num_base_bdevs_discovered": 2, 00:33:27.850 "num_base_bdevs_operational": 4, 00:33:27.850 "base_bdevs_list": [ 00:33:27.850 { 00:33:27.850 "name": "BaseBdev1", 00:33:27.850 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:27.850 "is_configured": true, 00:33:27.850 "data_offset": 2048, 00:33:27.850 "data_size": 63488 00:33:27.850 }, 00:33:27.850 { 00:33:27.850 "name": "BaseBdev2", 00:33:27.850 "uuid": 
"bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:27.850 "is_configured": true, 00:33:27.850 "data_offset": 2048, 00:33:27.850 "data_size": 63488 00:33:27.850 }, 00:33:27.850 { 00:33:27.850 "name": "BaseBdev3", 00:33:27.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:27.850 "is_configured": false, 00:33:27.850 "data_offset": 0, 00:33:27.850 "data_size": 0 00:33:27.850 }, 00:33:27.850 { 00:33:27.850 "name": "BaseBdev4", 00:33:27.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:27.850 "is_configured": false, 00:33:27.850 "data_offset": 0, 00:33:27.850 "data_size": 0 00:33:27.850 } 00:33:27.850 ] 00:33:27.850 }' 00:33:27.850 11:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:27.850 11:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:33:28.416 [2024-06-10 11:44:12.247635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:28.416 BaseBdev3 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:28.416 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:33:28.674 [ 00:33:28.674 { 00:33:28.674 "name": "BaseBdev3", 00:33:28.674 "aliases": [ 00:33:28.674 "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea" 00:33:28.674 ], 00:33:28.674 "product_name": "Malloc disk", 00:33:28.674 "block_size": 512, 00:33:28.674 "num_blocks": 65536, 00:33:28.674 "uuid": "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea", 00:33:28.674 "assigned_rate_limits": { 00:33:28.674 "rw_ios_per_sec": 0, 00:33:28.674 "rw_mbytes_per_sec": 0, 00:33:28.674 "r_mbytes_per_sec": 0, 00:33:28.674 "w_mbytes_per_sec": 0 00:33:28.674 }, 00:33:28.674 "claimed": true, 00:33:28.674 "claim_type": "exclusive_write", 00:33:28.674 "zoned": false, 00:33:28.674 "supported_io_types": { 00:33:28.674 "read": true, 00:33:28.674 "write": true, 00:33:28.674 "unmap": true, 00:33:28.674 "write_zeroes": true, 00:33:28.674 "flush": true, 00:33:28.674 "reset": true, 00:33:28.674 "compare": false, 00:33:28.674 "compare_and_write": false, 00:33:28.674 "abort": true, 00:33:28.674 "nvme_admin": false, 00:33:28.674 "nvme_io": false 00:33:28.674 }, 00:33:28.674 "memory_domains": [ 00:33:28.674 { 00:33:28.674 "dma_device_id": "system", 00:33:28.674 "dma_device_type": 1 00:33:28.674 }, 00:33:28.674 { 00:33:28.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:28.674 "dma_device_type": 2 00:33:28.674 } 00:33:28.674 ], 00:33:28.674 "driver_specific": {} 00:33:28.674 } 00:33:28.674 ] 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:28.674 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:28.932 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:28.932 "name": "Existed_Raid", 00:33:28.932 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:28.932 "strip_size_kb": 64, 00:33:28.932 "state": "configuring", 00:33:28.932 "raid_level": "raid0", 00:33:28.932 "superblock": true, 00:33:28.932 "num_base_bdevs": 4, 00:33:28.932 "num_base_bdevs_discovered": 3, 00:33:28.932 
"num_base_bdevs_operational": 4, 00:33:28.932 "base_bdevs_list": [ 00:33:28.932 { 00:33:28.932 "name": "BaseBdev1", 00:33:28.932 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:28.932 "is_configured": true, 00:33:28.932 "data_offset": 2048, 00:33:28.932 "data_size": 63488 00:33:28.932 }, 00:33:28.932 { 00:33:28.932 "name": "BaseBdev2", 00:33:28.932 "uuid": "bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:28.932 "is_configured": true, 00:33:28.932 "data_offset": 2048, 00:33:28.932 "data_size": 63488 00:33:28.932 }, 00:33:28.932 { 00:33:28.932 "name": "BaseBdev3", 00:33:28.932 "uuid": "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea", 00:33:28.932 "is_configured": true, 00:33:28.932 "data_offset": 2048, 00:33:28.932 "data_size": 63488 00:33:28.932 }, 00:33:28.932 { 00:33:28.932 "name": "BaseBdev4", 00:33:28.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:28.932 "is_configured": false, 00:33:28.932 "data_offset": 0, 00:33:28.932 "data_size": 0 00:33:28.932 } 00:33:28.932 ] 00:33:28.932 }' 00:33:28.932 11:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:28.932 11:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:29.509 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:33:29.509 [2024-06-10 11:44:13.441586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:33:29.509 [2024-06-10 11:44:13.441714] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a7ae20 00:33:29.509 [2024-06-10 11:44:13.441727] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:33:29.509 [2024-06-10 11:44:13.441846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7ba70 00:33:29.509 [2024-06-10 11:44:13.441940] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a7ae20 00:33:29.509 [2024-06-10 11:44:13.441947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a7ae20 00:33:29.509 [2024-06-10 11:44:13.442014] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:29.509 BaseBdev4 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:29.766 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:33:30.024 [ 00:33:30.024 { 00:33:30.024 "name": "BaseBdev4", 00:33:30.024 "aliases": [ 00:33:30.024 "cc642895-b33b-445c-8ebc-854614aa0961" 00:33:30.024 ], 00:33:30.024 "product_name": "Malloc disk", 00:33:30.024 "block_size": 512, 00:33:30.024 "num_blocks": 65536, 00:33:30.024 "uuid": "cc642895-b33b-445c-8ebc-854614aa0961", 00:33:30.024 "assigned_rate_limits": { 00:33:30.024 "rw_ios_per_sec": 0, 00:33:30.024 "rw_mbytes_per_sec": 0, 00:33:30.024 "r_mbytes_per_sec": 0, 00:33:30.024 
"w_mbytes_per_sec": 0 00:33:30.024 }, 00:33:30.024 "claimed": true, 00:33:30.024 "claim_type": "exclusive_write", 00:33:30.024 "zoned": false, 00:33:30.024 "supported_io_types": { 00:33:30.024 "read": true, 00:33:30.024 "write": true, 00:33:30.024 "unmap": true, 00:33:30.024 "write_zeroes": true, 00:33:30.024 "flush": true, 00:33:30.024 "reset": true, 00:33:30.024 "compare": false, 00:33:30.024 "compare_and_write": false, 00:33:30.024 "abort": true, 00:33:30.024 "nvme_admin": false, 00:33:30.024 "nvme_io": false 00:33:30.024 }, 00:33:30.024 "memory_domains": [ 00:33:30.024 { 00:33:30.024 "dma_device_id": "system", 00:33:30.024 "dma_device_type": 1 00:33:30.024 }, 00:33:30.024 { 00:33:30.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.024 "dma_device_type": 2 00:33:30.024 } 00:33:30.024 ], 00:33:30.024 "driver_specific": {} 00:33:30.024 } 00:33:30.024 ] 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:30.024 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:30.282 11:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:30.282 "name": "Existed_Raid", 00:33:30.282 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:30.282 "strip_size_kb": 64, 00:33:30.282 "state": "online", 00:33:30.282 "raid_level": "raid0", 00:33:30.282 "superblock": true, 00:33:30.282 "num_base_bdevs": 4, 00:33:30.282 "num_base_bdevs_discovered": 4, 00:33:30.282 "num_base_bdevs_operational": 4, 00:33:30.282 "base_bdevs_list": [ 00:33:30.282 { 00:33:30.282 "name": "BaseBdev1", 00:33:30.283 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:30.283 "is_configured": true, 00:33:30.283 "data_offset": 2048, 00:33:30.283 "data_size": 63488 00:33:30.283 }, 00:33:30.283 { 00:33:30.283 "name": "BaseBdev2", 00:33:30.283 "uuid": "bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:30.283 "is_configured": true, 00:33:30.283 "data_offset": 2048, 00:33:30.283 "data_size": 63488 00:33:30.283 }, 00:33:30.283 { 00:33:30.283 "name": "BaseBdev3", 00:33:30.283 "uuid": "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea", 00:33:30.283 "is_configured": true, 00:33:30.283 "data_offset": 2048, 00:33:30.283 "data_size": 63488 00:33:30.283 }, 00:33:30.283 { 00:33:30.283 "name": "BaseBdev4", 00:33:30.283 "uuid": 
"cc642895-b33b-445c-8ebc-854614aa0961", 00:33:30.283 "is_configured": true, 00:33:30.283 "data_offset": 2048, 00:33:30.283 "data_size": 63488 00:33:30.283 } 00:33:30.283 ] 00:33:30.283 }' 00:33:30.283 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:30.283 11:44:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:30.849 [2024-06-10 11:44:14.656966] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:30.849 "name": "Existed_Raid", 00:33:30.849 "aliases": [ 00:33:30.849 "15c9e933-6572-4222-ba4c-d278668fd88c" 00:33:30.849 ], 00:33:30.849 "product_name": "Raid Volume", 00:33:30.849 "block_size": 512, 00:33:30.849 "num_blocks": 253952, 00:33:30.849 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:30.849 "assigned_rate_limits": { 00:33:30.849 "rw_ios_per_sec": 
0, 00:33:30.849 "rw_mbytes_per_sec": 0, 00:33:30.849 "r_mbytes_per_sec": 0, 00:33:30.849 "w_mbytes_per_sec": 0 00:33:30.849 }, 00:33:30.849 "claimed": false, 00:33:30.849 "zoned": false, 00:33:30.849 "supported_io_types": { 00:33:30.849 "read": true, 00:33:30.849 "write": true, 00:33:30.849 "unmap": true, 00:33:30.849 "write_zeroes": true, 00:33:30.849 "flush": true, 00:33:30.849 "reset": true, 00:33:30.849 "compare": false, 00:33:30.849 "compare_and_write": false, 00:33:30.849 "abort": false, 00:33:30.849 "nvme_admin": false, 00:33:30.849 "nvme_io": false 00:33:30.849 }, 00:33:30.849 "memory_domains": [ 00:33:30.849 { 00:33:30.849 "dma_device_id": "system", 00:33:30.849 "dma_device_type": 1 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.849 "dma_device_type": 2 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "system", 00:33:30.849 "dma_device_type": 1 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.849 "dma_device_type": 2 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "system", 00:33:30.849 "dma_device_type": 1 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.849 "dma_device_type": 2 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "system", 00:33:30.849 "dma_device_type": 1 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.849 "dma_device_type": 2 00:33:30.849 } 00:33:30.849 ], 00:33:30.849 "driver_specific": { 00:33:30.849 "raid": { 00:33:30.849 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:30.849 "strip_size_kb": 64, 00:33:30.849 "state": "online", 00:33:30.849 "raid_level": "raid0", 00:33:30.849 "superblock": true, 00:33:30.849 "num_base_bdevs": 4, 00:33:30.849 "num_base_bdevs_discovered": 4, 00:33:30.849 "num_base_bdevs_operational": 4, 00:33:30.849 "base_bdevs_list": [ 00:33:30.849 { 00:33:30.849 "name": "BaseBdev1", 
00:33:30.849 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:30.849 "is_configured": true, 00:33:30.849 "data_offset": 2048, 00:33:30.849 "data_size": 63488 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "name": "BaseBdev2", 00:33:30.849 "uuid": "bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:30.849 "is_configured": true, 00:33:30.849 "data_offset": 2048, 00:33:30.849 "data_size": 63488 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "name": "BaseBdev3", 00:33:30.849 "uuid": "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea", 00:33:30.849 "is_configured": true, 00:33:30.849 "data_offset": 2048, 00:33:30.849 "data_size": 63488 00:33:30.849 }, 00:33:30.849 { 00:33:30.849 "name": "BaseBdev4", 00:33:30.849 "uuid": "cc642895-b33b-445c-8ebc-854614aa0961", 00:33:30.849 "is_configured": true, 00:33:30.849 "data_offset": 2048, 00:33:30.849 "data_size": 63488 00:33:30.849 } 00:33:30.849 ] 00:33:30.849 } 00:33:30.849 } 00:33:30.849 }' 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:33:30.849 BaseBdev2 00:33:30.849 BaseBdev3 00:33:30.849 BaseBdev4' 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:33:30.849 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:31.107 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:31.107 "name": "BaseBdev1", 00:33:31.107 "aliases": [ 00:33:31.107 "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf" 00:33:31.107 ], 00:33:31.107 "product_name": "Malloc disk", 
00:33:31.107 "block_size": 512, 00:33:31.107 "num_blocks": 65536, 00:33:31.107 "uuid": "1fabdecc-ffba-47b9-a091-aaa8b9cdf4bf", 00:33:31.107 "assigned_rate_limits": { 00:33:31.107 "rw_ios_per_sec": 0, 00:33:31.107 "rw_mbytes_per_sec": 0, 00:33:31.107 "r_mbytes_per_sec": 0, 00:33:31.107 "w_mbytes_per_sec": 0 00:33:31.107 }, 00:33:31.107 "claimed": true, 00:33:31.108 "claim_type": "exclusive_write", 00:33:31.108 "zoned": false, 00:33:31.108 "supported_io_types": { 00:33:31.108 "read": true, 00:33:31.108 "write": true, 00:33:31.108 "unmap": true, 00:33:31.108 "write_zeroes": true, 00:33:31.108 "flush": true, 00:33:31.108 "reset": true, 00:33:31.108 "compare": false, 00:33:31.108 "compare_and_write": false, 00:33:31.108 "abort": true, 00:33:31.108 "nvme_admin": false, 00:33:31.108 "nvme_io": false 00:33:31.108 }, 00:33:31.108 "memory_domains": [ 00:33:31.108 { 00:33:31.108 "dma_device_id": "system", 00:33:31.108 "dma_device_type": 1 00:33:31.108 }, 00:33:31.108 { 00:33:31.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:31.108 "dma_device_type": 2 00:33:31.108 } 00:33:31.108 ], 00:33:31.108 "driver_specific": {} 00:33:31.108 }' 00:33:31.108 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:31.108 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:31.108 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:31.108 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:31.108 11:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:31.108 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:31.108 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:33:31.366 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:31.624 "name": "BaseBdev2", 00:33:31.624 "aliases": [ 00:33:31.624 "bf276d4c-d012-4e3d-856e-4dbe43e528b8" 00:33:31.624 ], 00:33:31.624 "product_name": "Malloc disk", 00:33:31.624 "block_size": 512, 00:33:31.624 "num_blocks": 65536, 00:33:31.624 "uuid": "bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:31.624 "assigned_rate_limits": { 00:33:31.624 "rw_ios_per_sec": 0, 00:33:31.624 "rw_mbytes_per_sec": 0, 00:33:31.624 "r_mbytes_per_sec": 0, 00:33:31.624 "w_mbytes_per_sec": 0 00:33:31.624 }, 00:33:31.624 "claimed": true, 00:33:31.624 "claim_type": "exclusive_write", 00:33:31.624 "zoned": false, 00:33:31.624 "supported_io_types": { 00:33:31.624 "read": true, 00:33:31.624 "write": true, 00:33:31.624 "unmap": true, 00:33:31.624 "write_zeroes": true, 00:33:31.624 "flush": true, 00:33:31.624 "reset": true, 00:33:31.624 "compare": false, 00:33:31.624 "compare_and_write": false, 00:33:31.624 "abort": true, 00:33:31.624 "nvme_admin": false, 00:33:31.624 "nvme_io": false 00:33:31.624 }, 00:33:31.624 "memory_domains": [ 00:33:31.624 { 
00:33:31.624 "dma_device_id": "system", 00:33:31.624 "dma_device_type": 1 00:33:31.624 }, 00:33:31.624 { 00:33:31.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:31.624 "dma_device_type": 2 00:33:31.624 } 00:33:31.624 ], 00:33:31.624 "driver_specific": {} 00:33:31.624 }' 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:31.624 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:31.882 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:31.882 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:31.882 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:31.882 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:33:31.882 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:31.882 11:44:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:31.882 "name": "BaseBdev3", 00:33:31.882 "aliases": [ 00:33:31.882 "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea" 00:33:31.882 ], 00:33:31.882 "product_name": "Malloc disk", 00:33:31.882 "block_size": 512, 00:33:31.882 "num_blocks": 65536, 00:33:31.882 "uuid": "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea", 00:33:31.882 "assigned_rate_limits": { 00:33:31.882 "rw_ios_per_sec": 0, 00:33:31.882 "rw_mbytes_per_sec": 0, 00:33:31.882 "r_mbytes_per_sec": 0, 00:33:31.882 "w_mbytes_per_sec": 0 00:33:31.882 }, 00:33:31.882 "claimed": true, 00:33:31.882 "claim_type": "exclusive_write", 00:33:31.882 "zoned": false, 00:33:31.882 "supported_io_types": { 00:33:31.882 "read": true, 00:33:31.882 "write": true, 00:33:31.882 "unmap": true, 00:33:31.882 "write_zeroes": true, 00:33:31.882 "flush": true, 00:33:31.882 "reset": true, 00:33:31.882 "compare": false, 00:33:31.882 "compare_and_write": false, 00:33:31.882 "abort": true, 00:33:31.882 "nvme_admin": false, 00:33:31.882 "nvme_io": false 00:33:31.882 }, 00:33:31.882 "memory_domains": [ 00:33:31.882 { 00:33:31.882 "dma_device_id": "system", 00:33:31.882 "dma_device_type": 1 00:33:31.882 }, 00:33:31.882 { 00:33:31.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:31.882 "dma_device_type": 2 00:33:31.882 } 00:33:31.882 ], 00:33:31.882 "driver_specific": {} 00:33:31.882 }' 00:33:31.882 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:32.140 11:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:32.140 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:32.140 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:32.140 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:32.397 "name": "BaseBdev4", 00:33:32.397 "aliases": [ 00:33:32.397 "cc642895-b33b-445c-8ebc-854614aa0961" 00:33:32.397 ], 00:33:32.397 "product_name": "Malloc disk", 00:33:32.397 "block_size": 512, 00:33:32.397 "num_blocks": 65536, 00:33:32.397 "uuid": "cc642895-b33b-445c-8ebc-854614aa0961", 00:33:32.397 "assigned_rate_limits": { 00:33:32.397 "rw_ios_per_sec": 0, 00:33:32.397 "rw_mbytes_per_sec": 0, 00:33:32.397 "r_mbytes_per_sec": 0, 00:33:32.397 "w_mbytes_per_sec": 0 00:33:32.397 }, 00:33:32.397 "claimed": true, 00:33:32.397 "claim_type": "exclusive_write", 00:33:32.397 "zoned": false, 00:33:32.397 "supported_io_types": { 00:33:32.397 "read": true, 00:33:32.397 "write": true, 00:33:32.397 "unmap": true, 00:33:32.397 "write_zeroes": true, 00:33:32.397 "flush": 
true, 00:33:32.397 "reset": true, 00:33:32.397 "compare": false, 00:33:32.397 "compare_and_write": false, 00:33:32.397 "abort": true, 00:33:32.397 "nvme_admin": false, 00:33:32.397 "nvme_io": false 00:33:32.397 }, 00:33:32.397 "memory_domains": [ 00:33:32.397 { 00:33:32.397 "dma_device_id": "system", 00:33:32.397 "dma_device_type": 1 00:33:32.397 }, 00:33:32.397 { 00:33:32.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:32.397 "dma_device_type": 2 00:33:32.397 } 00:33:32.397 ], 00:33:32.397 "driver_specific": {} 00:33:32.397 }' 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:32.397 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:32.655 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:33:32.913 [2024-06-10 11:44:16.722123] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:32.913 [2024-06-10 11:44:16.722146] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:32.913 [2024-06-10 11:44:16.722177] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:32.913 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:33.171 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:33.171 "name": "Existed_Raid", 00:33:33.171 "uuid": "15c9e933-6572-4222-ba4c-d278668fd88c", 00:33:33.171 "strip_size_kb": 64, 00:33:33.171 "state": "offline", 00:33:33.171 "raid_level": "raid0", 00:33:33.171 "superblock": true, 00:33:33.171 "num_base_bdevs": 4, 00:33:33.171 "num_base_bdevs_discovered": 3, 00:33:33.171 "num_base_bdevs_operational": 3, 00:33:33.171 "base_bdevs_list": [ 00:33:33.171 { 00:33:33.171 "name": null, 00:33:33.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:33.171 "is_configured": false, 00:33:33.171 "data_offset": 2048, 00:33:33.171 "data_size": 63488 00:33:33.171 }, 00:33:33.171 { 00:33:33.171 "name": "BaseBdev2", 00:33:33.171 "uuid": "bf276d4c-d012-4e3d-856e-4dbe43e528b8", 00:33:33.171 "is_configured": true, 00:33:33.171 "data_offset": 2048, 00:33:33.171 "data_size": 63488 00:33:33.172 }, 00:33:33.172 { 00:33:33.172 "name": "BaseBdev3", 00:33:33.172 "uuid": "3bb0dcfc-5f51-4d51-942a-0166eff4f3ea", 00:33:33.172 "is_configured": true, 00:33:33.172 "data_offset": 2048, 00:33:33.172 "data_size": 63488 00:33:33.172 }, 00:33:33.172 { 00:33:33.172 "name": "BaseBdev4", 00:33:33.172 "uuid": "cc642895-b33b-445c-8ebc-854614aa0961", 00:33:33.172 "is_configured": true, 00:33:33.172 "data_offset": 2048, 00:33:33.172 "data_size": 63488 00:33:33.172 } 00:33:33.172 ] 00:33:33.172 }' 00:33:33.172 11:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:33.172 
11:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:33.737 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:33:33.995 [2024-06-10 11:44:17.713431] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:33.995 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:33.995 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:33.995 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:33.995 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:33.995 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:33.995 11:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:33.995 11:44:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:33:34.252 [2024-06-10 11:44:18.078049] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:33:34.252 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:34.252 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:34.252 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:34.252 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:34.510 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:34.510 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:34.510 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:33:34.510 [2024-06-10 11:44:18.437002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:33:34.510 [2024-06-10 11:44:18.437037] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a7ae20 name Existed_Raid, state offline 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:34.769 11:44:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:34.769 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:33:35.027 BaseBdev2 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:35.027 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:35.285 11:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:33:35.285 [ 00:33:35.285 { 00:33:35.285 "name": "BaseBdev2", 00:33:35.285 "aliases": [ 00:33:35.285 "b4db141b-3e3c-4c1a-ad3e-a668567ed111" 00:33:35.285 ], 00:33:35.285 "product_name": "Malloc disk", 00:33:35.285 "block_size": 512, 00:33:35.285 "num_blocks": 65536, 00:33:35.285 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:35.285 "assigned_rate_limits": { 00:33:35.285 "rw_ios_per_sec": 0, 00:33:35.285 "rw_mbytes_per_sec": 0, 00:33:35.285 "r_mbytes_per_sec": 0, 00:33:35.285 "w_mbytes_per_sec": 0 00:33:35.285 }, 00:33:35.285 "claimed": false, 00:33:35.285 "zoned": false, 00:33:35.285 "supported_io_types": { 00:33:35.285 "read": true, 00:33:35.285 "write": true, 00:33:35.285 "unmap": true, 00:33:35.285 "write_zeroes": true, 00:33:35.285 "flush": true, 00:33:35.285 "reset": true, 00:33:35.285 "compare": false, 00:33:35.285 "compare_and_write": false, 00:33:35.285 "abort": true, 00:33:35.285 "nvme_admin": false, 00:33:35.285 "nvme_io": false 00:33:35.285 }, 00:33:35.285 "memory_domains": [ 00:33:35.285 { 00:33:35.285 "dma_device_id": "system", 00:33:35.285 "dma_device_type": 1 00:33:35.285 }, 00:33:35.285 { 00:33:35.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:35.286 "dma_device_type": 2 00:33:35.286 } 00:33:35.286 ], 00:33:35.286 "driver_specific": {} 00:33:35.286 } 00:33:35.286 ] 00:33:35.286 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:35.286 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:35.286 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:35.286 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:33:35.543 BaseBdev3 00:33:35.543 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:33:35.543 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:33:35.543 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:35.543 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:35.543 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:35.544 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:35.544 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:35.801 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:33:35.801 [ 00:33:35.801 { 00:33:35.801 "name": "BaseBdev3", 00:33:35.801 "aliases": [ 00:33:35.801 "53179180-1000-4e8b-a594-1fc57495b61f" 00:33:35.801 ], 00:33:35.801 "product_name": "Malloc disk", 00:33:35.801 "block_size": 512, 00:33:35.801 "num_blocks": 65536, 00:33:35.801 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:35.801 "assigned_rate_limits": { 00:33:35.801 "rw_ios_per_sec": 0, 00:33:35.801 "rw_mbytes_per_sec": 0, 00:33:35.801 "r_mbytes_per_sec": 0, 00:33:35.801 "w_mbytes_per_sec": 0 00:33:35.801 }, 00:33:35.801 "claimed": false, 00:33:35.801 "zoned": false, 00:33:35.801 "supported_io_types": { 00:33:35.801 "read": true, 00:33:35.801 "write": true, 00:33:35.801 "unmap": true, 00:33:35.801 "write_zeroes": true, 00:33:35.801 "flush": true, 00:33:35.801 "reset": true, 00:33:35.801 "compare": false, 00:33:35.801 "compare_and_write": false, 00:33:35.801 "abort": true, 00:33:35.801 "nvme_admin": false, 00:33:35.801 "nvme_io": false 00:33:35.801 }, 00:33:35.801 
"memory_domains": [ 00:33:35.801 { 00:33:35.801 "dma_device_id": "system", 00:33:35.801 "dma_device_type": 1 00:33:35.801 }, 00:33:35.801 { 00:33:35.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:35.801 "dma_device_type": 2 00:33:35.801 } 00:33:35.801 ], 00:33:35.801 "driver_specific": {} 00:33:35.801 } 00:33:35.801 ] 00:33:35.801 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:35.801 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:35.802 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:35.802 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:33:36.060 BaseBdev4 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:36.060 11:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:33:36.318 [ 00:33:36.318 { 00:33:36.318 "name": "BaseBdev4", 00:33:36.318 "aliases": [ 00:33:36.318 "cd4c6a3a-0409-40ba-a88b-45194a2d2171" 00:33:36.318 ], 00:33:36.318 "product_name": "Malloc disk", 00:33:36.318 "block_size": 512, 00:33:36.318 "num_blocks": 65536, 00:33:36.318 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:36.318 "assigned_rate_limits": { 00:33:36.318 "rw_ios_per_sec": 0, 00:33:36.318 "rw_mbytes_per_sec": 0, 00:33:36.318 "r_mbytes_per_sec": 0, 00:33:36.318 "w_mbytes_per_sec": 0 00:33:36.318 }, 00:33:36.318 "claimed": false, 00:33:36.318 "zoned": false, 00:33:36.318 "supported_io_types": { 00:33:36.318 "read": true, 00:33:36.318 "write": true, 00:33:36.318 "unmap": true, 00:33:36.318 "write_zeroes": true, 00:33:36.318 "flush": true, 00:33:36.318 "reset": true, 00:33:36.318 "compare": false, 00:33:36.318 "compare_and_write": false, 00:33:36.318 "abort": true, 00:33:36.318 "nvme_admin": false, 00:33:36.318 "nvme_io": false 00:33:36.318 }, 00:33:36.318 "memory_domains": [ 00:33:36.318 { 00:33:36.318 "dma_device_id": "system", 00:33:36.318 "dma_device_type": 1 00:33:36.318 }, 00:33:36.318 { 00:33:36.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:36.318 "dma_device_type": 2 00:33:36.318 } 00:33:36.318 ], 00:33:36.318 "driver_specific": {} 00:33:36.318 } 00:33:36.318 ] 00:33:36.318 11:44:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:36.318 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:36.318 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:36.318 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:36.576 [2024-06-10 11:44:20.301082] bdev.c:8114:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:36.576 [2024-06-10 11:44:20.301116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:36.576 [2024-06-10 11:44:20.301130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:36.576 [2024-06-10 11:44:20.302100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:36.576 [2024-06-10 11:44:20.302129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:36.576 11:44:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:36.576 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:36.576 "name": "Existed_Raid", 00:33:36.576 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:36.576 "strip_size_kb": 64, 00:33:36.576 "state": "configuring", 00:33:36.576 "raid_level": "raid0", 00:33:36.576 "superblock": true, 00:33:36.576 "num_base_bdevs": 4, 00:33:36.576 "num_base_bdevs_discovered": 3, 00:33:36.576 "num_base_bdevs_operational": 4, 00:33:36.576 "base_bdevs_list": [ 00:33:36.576 { 00:33:36.576 "name": "BaseBdev1", 00:33:36.576 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:36.576 "is_configured": false, 00:33:36.576 "data_offset": 0, 00:33:36.576 "data_size": 0 00:33:36.576 }, 00:33:36.576 { 00:33:36.576 "name": "BaseBdev2", 00:33:36.576 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:36.576 "is_configured": true, 00:33:36.576 "data_offset": 2048, 00:33:36.576 "data_size": 63488 00:33:36.576 }, 00:33:36.576 { 00:33:36.576 "name": "BaseBdev3", 00:33:36.576 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:36.576 "is_configured": true, 00:33:36.576 "data_offset": 2048, 00:33:36.576 "data_size": 63488 00:33:36.576 }, 00:33:36.576 { 00:33:36.576 "name": "BaseBdev4", 00:33:36.576 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:36.576 "is_configured": true, 00:33:36.576 "data_offset": 2048, 00:33:36.576 "data_size": 63488 00:33:36.576 } 00:33:36.577 ] 00:33:36.577 }' 00:33:36.577 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:36.577 11:44:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:37.140 11:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:33:37.397 
[2024-06-10 11:44:21.143243] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:37.397 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:37.655 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:37.655 "name": "Existed_Raid", 00:33:37.655 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:37.655 "strip_size_kb": 64, 00:33:37.655 "state": "configuring", 00:33:37.655 "raid_level": "raid0", 00:33:37.655 "superblock": true, 00:33:37.655 "num_base_bdevs": 
4, 00:33:37.655 "num_base_bdevs_discovered": 2, 00:33:37.655 "num_base_bdevs_operational": 4, 00:33:37.655 "base_bdevs_list": [ 00:33:37.655 { 00:33:37.655 "name": "BaseBdev1", 00:33:37.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:37.655 "is_configured": false, 00:33:37.655 "data_offset": 0, 00:33:37.655 "data_size": 0 00:33:37.655 }, 00:33:37.655 { 00:33:37.655 "name": null, 00:33:37.655 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:37.655 "is_configured": false, 00:33:37.655 "data_offset": 2048, 00:33:37.655 "data_size": 63488 00:33:37.655 }, 00:33:37.655 { 00:33:37.655 "name": "BaseBdev3", 00:33:37.655 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:37.655 "is_configured": true, 00:33:37.655 "data_offset": 2048, 00:33:37.655 "data_size": 63488 00:33:37.655 }, 00:33:37.655 { 00:33:37.655 "name": "BaseBdev4", 00:33:37.655 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:37.655 "is_configured": true, 00:33:37.655 "data_offset": 2048, 00:33:37.655 "data_size": 63488 00:33:37.655 } 00:33:37.655 ] 00:33:37.655 }' 00:33:37.655 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:37.655 11:44:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:37.913 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:37.913 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:33:38.171 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:33:38.171 11:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:33:38.428 [2024-06-10 11:44:22.161931] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:38.428 BaseBdev1 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:38.428 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:33:38.685 [ 00:33:38.685 { 00:33:38.685 "name": "BaseBdev1", 00:33:38.685 "aliases": [ 00:33:38.685 "214e0c4b-df76-4a65-8746-d1bfa61bcc92" 00:33:38.685 ], 00:33:38.685 "product_name": "Malloc disk", 00:33:38.685 "block_size": 512, 00:33:38.685 "num_blocks": 65536, 00:33:38.685 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:38.685 "assigned_rate_limits": { 00:33:38.685 "rw_ios_per_sec": 0, 00:33:38.685 "rw_mbytes_per_sec": 0, 00:33:38.685 "r_mbytes_per_sec": 0, 00:33:38.685 "w_mbytes_per_sec": 0 00:33:38.685 }, 00:33:38.685 "claimed": true, 00:33:38.685 "claim_type": "exclusive_write", 00:33:38.685 "zoned": false, 00:33:38.685 "supported_io_types": { 00:33:38.685 "read": true, 00:33:38.685 "write": true, 00:33:38.685 "unmap": true, 00:33:38.685 
"write_zeroes": true, 00:33:38.685 "flush": true, 00:33:38.685 "reset": true, 00:33:38.685 "compare": false, 00:33:38.685 "compare_and_write": false, 00:33:38.685 "abort": true, 00:33:38.685 "nvme_admin": false, 00:33:38.685 "nvme_io": false 00:33:38.685 }, 00:33:38.685 "memory_domains": [ 00:33:38.685 { 00:33:38.685 "dma_device_id": "system", 00:33:38.685 "dma_device_type": 1 00:33:38.685 }, 00:33:38.685 { 00:33:38.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:38.685 "dma_device_type": 2 00:33:38.685 } 00:33:38.685 ], 00:33:38.685 "driver_specific": {} 00:33:38.685 } 00:33:38.685 ] 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:38.685 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:38.942 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:38.942 "name": "Existed_Raid", 00:33:38.942 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:38.942 "strip_size_kb": 64, 00:33:38.942 "state": "configuring", 00:33:38.942 "raid_level": "raid0", 00:33:38.942 "superblock": true, 00:33:38.942 "num_base_bdevs": 4, 00:33:38.942 "num_base_bdevs_discovered": 3, 00:33:38.942 "num_base_bdevs_operational": 4, 00:33:38.942 "base_bdevs_list": [ 00:33:38.942 { 00:33:38.942 "name": "BaseBdev1", 00:33:38.942 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:38.942 "is_configured": true, 00:33:38.942 "data_offset": 2048, 00:33:38.942 "data_size": 63488 00:33:38.942 }, 00:33:38.942 { 00:33:38.942 "name": null, 00:33:38.942 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:38.942 "is_configured": false, 00:33:38.942 "data_offset": 2048, 00:33:38.942 "data_size": 63488 00:33:38.942 }, 00:33:38.942 { 00:33:38.942 "name": "BaseBdev3", 00:33:38.942 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:38.942 "is_configured": true, 00:33:38.942 "data_offset": 2048, 00:33:38.942 "data_size": 63488 00:33:38.942 }, 00:33:38.942 { 00:33:38.942 "name": "BaseBdev4", 00:33:38.942 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:38.942 "is_configured": true, 00:33:38.942 "data_offset": 2048, 00:33:38.942 "data_size": 63488 00:33:38.942 } 00:33:38.942 ] 00:33:38.942 }' 00:33:38.942 11:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:38.942 11:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:39.199 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:39.199 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:33:39.457 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:33:39.457 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:33:39.714 [2024-06-10 11:44:23.465321] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:39.714 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:39.972 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:39.972 "name": "Existed_Raid", 00:33:39.972 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:39.972 "strip_size_kb": 64, 00:33:39.972 "state": "configuring", 00:33:39.972 "raid_level": "raid0", 00:33:39.972 "superblock": true, 00:33:39.972 "num_base_bdevs": 4, 00:33:39.972 "num_base_bdevs_discovered": 2, 00:33:39.972 "num_base_bdevs_operational": 4, 00:33:39.972 "base_bdevs_list": [ 00:33:39.972 { 00:33:39.972 "name": "BaseBdev1", 00:33:39.972 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:39.972 "is_configured": true, 00:33:39.972 "data_offset": 2048, 00:33:39.972 "data_size": 63488 00:33:39.972 }, 00:33:39.972 { 00:33:39.972 "name": null, 00:33:39.972 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:39.972 "is_configured": false, 00:33:39.972 "data_offset": 2048, 00:33:39.972 "data_size": 63488 00:33:39.972 }, 00:33:39.972 { 00:33:39.972 "name": null, 00:33:39.972 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:39.972 "is_configured": false, 00:33:39.972 "data_offset": 2048, 00:33:39.972 "data_size": 63488 00:33:39.972 }, 00:33:39.972 { 00:33:39.972 "name": "BaseBdev4", 00:33:39.972 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:39.972 "is_configured": true, 00:33:39.972 "data_offset": 2048, 00:33:39.972 "data_size": 63488 00:33:39.972 } 00:33:39.972 ] 00:33:39.972 }' 00:33:39.972 11:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:39.972 11:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:40.538 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:40.538 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:33:40.538 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:33:40.538 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:33:40.795 [2024-06-10 11:44:24.504014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:40.795 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:40.795 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:40.795 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:40.795 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:40.795 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:40.796 "name": "Existed_Raid", 00:33:40.796 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:40.796 "strip_size_kb": 64, 00:33:40.796 "state": "configuring", 00:33:40.796 "raid_level": "raid0", 00:33:40.796 "superblock": true, 00:33:40.796 "num_base_bdevs": 4, 00:33:40.796 "num_base_bdevs_discovered": 3, 00:33:40.796 "num_base_bdevs_operational": 4, 00:33:40.796 "base_bdevs_list": [ 00:33:40.796 { 00:33:40.796 "name": "BaseBdev1", 00:33:40.796 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:40.796 "is_configured": true, 00:33:40.796 "data_offset": 2048, 00:33:40.796 "data_size": 63488 00:33:40.796 }, 00:33:40.796 { 00:33:40.796 "name": null, 00:33:40.796 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:40.796 "is_configured": false, 00:33:40.796 "data_offset": 2048, 00:33:40.796 "data_size": 63488 00:33:40.796 }, 00:33:40.796 { 00:33:40.796 "name": "BaseBdev3", 00:33:40.796 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:40.796 "is_configured": true, 00:33:40.796 "data_offset": 2048, 00:33:40.796 "data_size": 63488 00:33:40.796 }, 00:33:40.796 { 00:33:40.796 "name": "BaseBdev4", 00:33:40.796 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:40.796 "is_configured": true, 00:33:40.796 "data_offset": 2048, 00:33:40.796 "data_size": 63488 00:33:40.796 } 00:33:40.796 ] 00:33:40.796 }' 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:40.796 11:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:41.361 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:41.361 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:33:41.618 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:33:41.619 [2024-06-10 11:44:25.514632] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:41.619 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:41.877 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:41.877 "name": "Existed_Raid", 00:33:41.877 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:41.877 "strip_size_kb": 64, 00:33:41.877 "state": "configuring", 00:33:41.877 "raid_level": "raid0", 00:33:41.877 "superblock": true, 00:33:41.877 "num_base_bdevs": 4, 00:33:41.877 "num_base_bdevs_discovered": 2, 00:33:41.877 "num_base_bdevs_operational": 4, 00:33:41.877 "base_bdevs_list": [ 00:33:41.877 { 00:33:41.877 "name": null, 00:33:41.877 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:41.877 "is_configured": false, 00:33:41.877 "data_offset": 2048, 00:33:41.877 "data_size": 63488 00:33:41.877 }, 00:33:41.877 { 00:33:41.877 "name": null, 00:33:41.877 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:41.877 "is_configured": false, 00:33:41.877 "data_offset": 2048, 00:33:41.877 "data_size": 63488 00:33:41.877 }, 00:33:41.877 { 00:33:41.877 "name": "BaseBdev3", 00:33:41.877 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:41.877 "is_configured": true, 00:33:41.877 "data_offset": 2048, 00:33:41.877 "data_size": 63488 00:33:41.877 }, 00:33:41.877 { 00:33:41.877 "name": "BaseBdev4", 00:33:41.877 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:41.877 "is_configured": true, 00:33:41.877 "data_offset": 2048, 00:33:41.877 "data_size": 63488 00:33:41.877 } 00:33:41.877 ] 00:33:41.877 }' 00:33:41.877 11:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:41.877 11:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:42.441 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:33:42.441 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:42.441 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:33:42.441 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:33:42.699 [2024-06-10 11:44:26.535245] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:42.699 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:42.958 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:42.958 "name": "Existed_Raid", 00:33:42.958 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:42.958 "strip_size_kb": 64, 00:33:42.958 "state": "configuring", 00:33:42.958 "raid_level": "raid0", 00:33:42.958 "superblock": true, 00:33:42.958 "num_base_bdevs": 4, 00:33:42.958 "num_base_bdevs_discovered": 3, 00:33:42.958 "num_base_bdevs_operational": 4, 00:33:42.958 "base_bdevs_list": [ 00:33:42.958 { 00:33:42.958 "name": null, 00:33:42.958 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:42.958 "is_configured": false, 00:33:42.958 "data_offset": 2048, 00:33:42.958 "data_size": 63488 00:33:42.958 }, 00:33:42.958 { 00:33:42.958 "name": "BaseBdev2", 00:33:42.958 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:42.958 "is_configured": true, 00:33:42.958 "data_offset": 2048, 00:33:42.958 "data_size": 63488 00:33:42.958 }, 00:33:42.958 { 00:33:42.958 "name": "BaseBdev3", 00:33:42.958 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:42.958 "is_configured": true, 00:33:42.958 "data_offset": 2048, 00:33:42.958 "data_size": 63488 00:33:42.958 }, 00:33:42.958 { 00:33:42.958 "name": "BaseBdev4", 00:33:42.958 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:42.958 "is_configured": true, 00:33:42.958 "data_offset": 2048, 00:33:42.958 "data_size": 63488 00:33:42.958 } 00:33:42.958 ] 00:33:42.958 }' 00:33:42.958 11:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:42.958 11:44:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:43.524 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:43.524 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:33:43.524 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:33:43.524 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:33:43.524 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 214e0c4b-df76-4a65-8746-d1bfa61bcc92 00:33:43.782 [2024-06-10 11:44:27.685258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:33:43.782 [2024-06-10 11:44:27.685381] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a7fe60 00:33:43.782 [2024-06-10 11:44:27.685389] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:33:43.782 [2024-06-10 11:44:27.685507] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7e790 00:33:43.782 [2024-06-10 11:44:27.685588] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a7fe60 00:33:43.782 [2024-06-10 11:44:27.685595] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a7fe60 00:33:43.782 [2024-06-10 11:44:27.685653] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:43.782 NewBaseBdev 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:33:43.782 11:44:27 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:43.782 11:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:44.040 11:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:33:44.298 [ 00:33:44.298 { 00:33:44.298 "name": "NewBaseBdev", 00:33:44.298 "aliases": [ 00:33:44.298 "214e0c4b-df76-4a65-8746-d1bfa61bcc92" 00:33:44.298 ], 00:33:44.298 "product_name": "Malloc disk", 00:33:44.298 "block_size": 512, 00:33:44.298 "num_blocks": 65536, 00:33:44.298 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:44.298 "assigned_rate_limits": { 00:33:44.298 "rw_ios_per_sec": 0, 00:33:44.298 "rw_mbytes_per_sec": 0, 00:33:44.298 "r_mbytes_per_sec": 0, 00:33:44.298 "w_mbytes_per_sec": 0 00:33:44.298 }, 00:33:44.298 "claimed": true, 00:33:44.298 "claim_type": "exclusive_write", 00:33:44.298 "zoned": false, 00:33:44.298 "supported_io_types": { 00:33:44.298 "read": true, 00:33:44.298 "write": true, 00:33:44.298 "unmap": true, 00:33:44.298 "write_zeroes": true, 00:33:44.298 "flush": true, 00:33:44.298 "reset": true, 00:33:44.298 "compare": false, 00:33:44.298 "compare_and_write": false, 00:33:44.298 "abort": true, 00:33:44.298 "nvme_admin": false, 00:33:44.298 "nvme_io": false 
00:33:44.298 }, 00:33:44.298 "memory_domains": [ 00:33:44.298 { 00:33:44.298 "dma_device_id": "system", 00:33:44.298 "dma_device_type": 1 00:33:44.298 }, 00:33:44.298 { 00:33:44.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:44.298 "dma_device_type": 2 00:33:44.298 } 00:33:44.298 ], 00:33:44.298 "driver_specific": {} 00:33:44.298 } 00:33:44.298 ] 00:33:44.298 11:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:44.298 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:44.299 
11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:44.299 "name": "Existed_Raid", 00:33:44.299 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:44.299 "strip_size_kb": 64, 00:33:44.299 "state": "online", 00:33:44.299 "raid_level": "raid0", 00:33:44.299 "superblock": true, 00:33:44.299 "num_base_bdevs": 4, 00:33:44.299 "num_base_bdevs_discovered": 4, 00:33:44.299 "num_base_bdevs_operational": 4, 00:33:44.299 "base_bdevs_list": [ 00:33:44.299 { 00:33:44.299 "name": "NewBaseBdev", 00:33:44.299 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:44.299 "is_configured": true, 00:33:44.299 "data_offset": 2048, 00:33:44.299 "data_size": 63488 00:33:44.299 }, 00:33:44.299 { 00:33:44.299 "name": "BaseBdev2", 00:33:44.299 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:44.299 "is_configured": true, 00:33:44.299 "data_offset": 2048, 00:33:44.299 "data_size": 63488 00:33:44.299 }, 00:33:44.299 { 00:33:44.299 "name": "BaseBdev3", 00:33:44.299 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:44.299 "is_configured": true, 00:33:44.299 "data_offset": 2048, 00:33:44.299 "data_size": 63488 00:33:44.299 }, 00:33:44.299 { 00:33:44.299 "name": "BaseBdev4", 00:33:44.299 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:44.299 "is_configured": true, 00:33:44.299 "data_offset": 2048, 00:33:44.299 "data_size": 63488 00:33:44.299 } 00:33:44.299 ] 00:33:44.299 }' 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:44.299 11:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:44.882 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:45.140 [2024-06-10 11:44:28.832413] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:45.140 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:45.140 "name": "Existed_Raid", 00:33:45.140 "aliases": [ 00:33:45.140 "985023b0-7781-4e02-a068-05e646015b1c" 00:33:45.140 ], 00:33:45.140 "product_name": "Raid Volume", 00:33:45.140 "block_size": 512, 00:33:45.140 "num_blocks": 253952, 00:33:45.140 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:45.140 "assigned_rate_limits": { 00:33:45.140 "rw_ios_per_sec": 0, 00:33:45.140 "rw_mbytes_per_sec": 0, 00:33:45.140 "r_mbytes_per_sec": 0, 00:33:45.140 "w_mbytes_per_sec": 0 00:33:45.140 }, 00:33:45.140 "claimed": false, 00:33:45.140 "zoned": false, 00:33:45.140 "supported_io_types": { 00:33:45.140 "read": true, 00:33:45.140 "write": true, 00:33:45.140 "unmap": true, 00:33:45.140 "write_zeroes": true, 00:33:45.140 "flush": true, 00:33:45.140 "reset": true, 00:33:45.140 "compare": false, 00:33:45.140 "compare_and_write": false, 00:33:45.140 "abort": false, 00:33:45.140 "nvme_admin": false, 00:33:45.140 "nvme_io": false 00:33:45.140 }, 00:33:45.140 "memory_domains": [ 00:33:45.140 { 00:33:45.140 "dma_device_id": "system", 00:33:45.140 "dma_device_type": 1 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:45.140 "dma_device_type": 2 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 "dma_device_id": "system", 00:33:45.140 "dma_device_type": 1 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:45.140 "dma_device_type": 2 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 "dma_device_id": "system", 00:33:45.140 "dma_device_type": 1 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:45.140 "dma_device_type": 2 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 "dma_device_id": "system", 00:33:45.140 "dma_device_type": 1 00:33:45.140 }, 00:33:45.140 { 00:33:45.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:45.140 "dma_device_type": 2 00:33:45.140 } 00:33:45.140 ], 00:33:45.140 "driver_specific": { 00:33:45.140 "raid": { 00:33:45.140 "uuid": "985023b0-7781-4e02-a068-05e646015b1c", 00:33:45.140 "strip_size_kb": 64, 00:33:45.140 "state": "online", 00:33:45.140 "raid_level": "raid0", 00:33:45.140 "superblock": true, 00:33:45.140 "num_base_bdevs": 4, 00:33:45.140 "num_base_bdevs_discovered": 4, 00:33:45.140 "num_base_bdevs_operational": 4, 00:33:45.140 "base_bdevs_list": [ 00:33:45.140 { 00:33:45.140 "name": "NewBaseBdev", 00:33:45.140 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:45.140 "is_configured": true, 00:33:45.140 "data_offset": 2048, 00:33:45.140 "data_size": 63488 00:33:45.140 }, 00:33:45.141 { 00:33:45.141 "name": "BaseBdev2", 00:33:45.141 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:45.141 "is_configured": true, 00:33:45.141 "data_offset": 2048, 00:33:45.141 "data_size": 63488 00:33:45.141 }, 00:33:45.141 { 00:33:45.141 "name": "BaseBdev3", 00:33:45.141 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:45.141 "is_configured": true, 00:33:45.141 "data_offset": 2048, 00:33:45.141 "data_size": 63488 00:33:45.141 }, 00:33:45.141 { 00:33:45.141 "name": "BaseBdev4", 00:33:45.141 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 
00:33:45.141 "is_configured": true, 00:33:45.141 "data_offset": 2048, 00:33:45.141 "data_size": 63488 00:33:45.141 } 00:33:45.141 ] 00:33:45.141 } 00:33:45.141 } 00:33:45.141 }' 00:33:45.141 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:45.141 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:33:45.141 BaseBdev2 00:33:45.141 BaseBdev3 00:33:45.141 BaseBdev4' 00:33:45.141 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:45.141 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:33:45.141 11:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:45.141 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:45.141 "name": "NewBaseBdev", 00:33:45.141 "aliases": [ 00:33:45.141 "214e0c4b-df76-4a65-8746-d1bfa61bcc92" 00:33:45.141 ], 00:33:45.141 "product_name": "Malloc disk", 00:33:45.141 "block_size": 512, 00:33:45.141 "num_blocks": 65536, 00:33:45.141 "uuid": "214e0c4b-df76-4a65-8746-d1bfa61bcc92", 00:33:45.141 "assigned_rate_limits": { 00:33:45.141 "rw_ios_per_sec": 0, 00:33:45.141 "rw_mbytes_per_sec": 0, 00:33:45.141 "r_mbytes_per_sec": 0, 00:33:45.141 "w_mbytes_per_sec": 0 00:33:45.141 }, 00:33:45.141 "claimed": true, 00:33:45.141 "claim_type": "exclusive_write", 00:33:45.141 "zoned": false, 00:33:45.141 "supported_io_types": { 00:33:45.141 "read": true, 00:33:45.141 "write": true, 00:33:45.141 "unmap": true, 00:33:45.141 "write_zeroes": true, 00:33:45.141 "flush": true, 00:33:45.141 "reset": true, 00:33:45.141 "compare": false, 00:33:45.141 "compare_and_write": false, 00:33:45.141 "abort": true, 
00:33:45.141 "nvme_admin": false, 00:33:45.141 "nvme_io": false 00:33:45.141 }, 00:33:45.141 "memory_domains": [ 00:33:45.141 { 00:33:45.141 "dma_device_id": "system", 00:33:45.141 "dma_device_type": 1 00:33:45.141 }, 00:33:45.141 { 00:33:45.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:45.141 "dma_device_type": 2 00:33:45.141 } 00:33:45.141 ], 00:33:45.141 "driver_specific": {} 00:33:45.141 }' 00:33:45.141 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:45.399 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:45.657 "name": "BaseBdev2", 00:33:45.657 "aliases": [ 00:33:45.657 "b4db141b-3e3c-4c1a-ad3e-a668567ed111" 00:33:45.657 ], 00:33:45.657 "product_name": "Malloc disk", 00:33:45.657 "block_size": 512, 00:33:45.657 "num_blocks": 65536, 00:33:45.657 "uuid": "b4db141b-3e3c-4c1a-ad3e-a668567ed111", 00:33:45.657 "assigned_rate_limits": { 00:33:45.657 "rw_ios_per_sec": 0, 00:33:45.657 "rw_mbytes_per_sec": 0, 00:33:45.657 "r_mbytes_per_sec": 0, 00:33:45.657 "w_mbytes_per_sec": 0 00:33:45.657 }, 00:33:45.657 "claimed": true, 00:33:45.657 "claim_type": "exclusive_write", 00:33:45.657 "zoned": false, 00:33:45.657 "supported_io_types": { 00:33:45.657 "read": true, 00:33:45.657 "write": true, 00:33:45.657 "unmap": true, 00:33:45.657 "write_zeroes": true, 00:33:45.657 "flush": true, 00:33:45.657 "reset": true, 00:33:45.657 "compare": false, 00:33:45.657 "compare_and_write": false, 00:33:45.657 "abort": true, 00:33:45.657 "nvme_admin": false, 00:33:45.657 "nvme_io": false 00:33:45.657 }, 00:33:45.657 "memory_domains": [ 00:33:45.657 { 00:33:45.657 "dma_device_id": "system", 00:33:45.657 "dma_device_type": 1 00:33:45.657 }, 00:33:45.657 { 00:33:45.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:45.657 "dma_device_type": 2 00:33:45.657 } 00:33:45.657 ], 00:33:45.657 "driver_specific": {} 00:33:45.657 }' 00:33:45.657 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:45.915 11:44:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:45.915 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:46.174 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:46.174 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:46.174 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:33:46.174 11:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:46.174 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:46.174 "name": "BaseBdev3", 00:33:46.174 "aliases": [ 00:33:46.174 "53179180-1000-4e8b-a594-1fc57495b61f" 00:33:46.174 ], 00:33:46.174 "product_name": "Malloc disk", 00:33:46.174 "block_size": 512, 00:33:46.174 "num_blocks": 65536, 00:33:46.174 "uuid": "53179180-1000-4e8b-a594-1fc57495b61f", 00:33:46.174 "assigned_rate_limits": { 00:33:46.174 "rw_ios_per_sec": 0, 00:33:46.174 "rw_mbytes_per_sec": 0, 00:33:46.174 "r_mbytes_per_sec": 0, 00:33:46.174 "w_mbytes_per_sec": 0 00:33:46.174 }, 00:33:46.174 "claimed": true, 00:33:46.174 "claim_type": "exclusive_write", 00:33:46.174 "zoned": false, 00:33:46.174 "supported_io_types": 
{ 00:33:46.174 "read": true, 00:33:46.174 "write": true, 00:33:46.174 "unmap": true, 00:33:46.174 "write_zeroes": true, 00:33:46.174 "flush": true, 00:33:46.174 "reset": true, 00:33:46.174 "compare": false, 00:33:46.174 "compare_and_write": false, 00:33:46.174 "abort": true, 00:33:46.174 "nvme_admin": false, 00:33:46.174 "nvme_io": false 00:33:46.174 }, 00:33:46.174 "memory_domains": [ 00:33:46.174 { 00:33:46.174 "dma_device_id": "system", 00:33:46.174 "dma_device_type": 1 00:33:46.174 }, 00:33:46.174 { 00:33:46.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:46.174 "dma_device_type": 2 00:33:46.174 } 00:33:46.174 ], 00:33:46.174 "driver_specific": {} 00:33:46.174 }' 00:33:46.174 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:46.174 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:46.174 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:46.174 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:46.432 11:44:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:33:46.432 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:46.691 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:46.691 "name": "BaseBdev4", 00:33:46.691 "aliases": [ 00:33:46.691 "cd4c6a3a-0409-40ba-a88b-45194a2d2171" 00:33:46.691 ], 00:33:46.691 "product_name": "Malloc disk", 00:33:46.691 "block_size": 512, 00:33:46.691 "num_blocks": 65536, 00:33:46.691 "uuid": "cd4c6a3a-0409-40ba-a88b-45194a2d2171", 00:33:46.691 "assigned_rate_limits": { 00:33:46.691 "rw_ios_per_sec": 0, 00:33:46.691 "rw_mbytes_per_sec": 0, 00:33:46.691 "r_mbytes_per_sec": 0, 00:33:46.691 "w_mbytes_per_sec": 0 00:33:46.691 }, 00:33:46.691 "claimed": true, 00:33:46.691 "claim_type": "exclusive_write", 00:33:46.691 "zoned": false, 00:33:46.691 "supported_io_types": { 00:33:46.691 "read": true, 00:33:46.691 "write": true, 00:33:46.691 "unmap": true, 00:33:46.691 "write_zeroes": true, 00:33:46.691 "flush": true, 00:33:46.691 "reset": true, 00:33:46.691 "compare": false, 00:33:46.691 "compare_and_write": false, 00:33:46.691 "abort": true, 00:33:46.691 "nvme_admin": false, 00:33:46.691 "nvme_io": false 00:33:46.691 }, 00:33:46.691 "memory_domains": [ 00:33:46.691 { 00:33:46.691 "dma_device_id": "system", 00:33:46.691 "dma_device_type": 1 00:33:46.691 }, 00:33:46.691 { 00:33:46.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:46.691 "dma_device_type": 2 00:33:46.691 } 00:33:46.691 ], 00:33:46.691 "driver_specific": {} 00:33:46.691 }' 00:33:46.691 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:46.691 11:44:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:46.691 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:46.691 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:46.950 11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:47.208 [2024-06-10 11:44:30.965773] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:47.208 [2024-06-10 11:44:30.965796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:47.208 [2024-06-10 11:44:30.965836] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:47.208 [2024-06-10 11:44:30.965881] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:47.208 [2024-06-10 11:44:30.965890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a7fe60 name Existed_Raid, state offline 00:33:47.208 
11:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 183700 00:33:47.208 11:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 183700 ']' 00:33:47.208 11:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 183700 00:33:47.208 11:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:33:47.208 11:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:47.208 11:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 183700 00:33:47.208 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:47.209 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:47.209 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 183700' 00:33:47.209 killing process with pid 183700 00:33:47.209 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 183700 00:33:47.209 [2024-06-10 11:44:31.030545] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:47.209 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 183700 00:33:47.209 [2024-06-10 11:44:31.071244] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:47.468 11:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:33:47.468 00:33:47.468 real 0m24.690s 00:33:47.468 user 0m45.148s 00:33:47.468 sys 0m4.714s 00:33:47.468 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:47.468 11:44:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:47.468 ************************************ 
00:33:47.468 END TEST raid_state_function_test_sb 00:33:47.468 ************************************ 00:33:47.468 11:44:31 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:33:47.468 11:44:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:33:47.468 11:44:31 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:47.468 11:44:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:47.468 ************************************ 00:33:47.468 START TEST raid_superblock_test 00:33:47.468 ************************************ 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 4 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local 
raid_bdev_uuid 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=187585 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 187585 /var/tmp/spdk-raid.sock 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 187585 ']' 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:47.468 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:47.468 11:44:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:47.468 [2024-06-10 11:44:31.394696] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:33:47.468 [2024-06-10 11:44:31.394746] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid187585 ] 00:33:47.727 [2024-06-10 11:44:31.483891] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:47.727 [2024-06-10 11:44:31.571923] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:47.727 [2024-06-10 11:44:31.632957] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:47.727 [2024-06-10 11:44:31.632983] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:48.292 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:33:48.551 malloc1 00:33:48.551 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:33:48.810 [2024-06-10 11:44:32.549371] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:48.810 [2024-06-10 11:44:32.549412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:48.810 [2024-06-10 11:44:32.549425] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc51100 00:33:48.810 [2024-06-10 11:44:32.549433] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:48.810 [2024-06-10 11:44:32.550669] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:48.810 [2024-06-10 11:44:32.550695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:48.810 pt1 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:33:48.810 malloc2 00:33:48.810 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:49.069 [2024-06-10 11:44:32.895321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:49.069 [2024-06-10 11:44:32.895358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:49.069 [2024-06-10 11:44:32.895372] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc52500 00:33:49.069 [2024-06-10 11:44:32.895380] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:49.069 [2024-06-10 11:44:32.896495] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:49.069 [2024-06-10 11:44:32.896520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:49.069 pt2 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:49.069 11:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:33:49.328 malloc3 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:33:49.328 [2024-06-10 11:44:33.225097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:33:49.328 [2024-06-10 11:44:33.225136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:49.328 [2024-06-10 11:44:33.225149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfc7a0 00:33:49.328 [2024-06-10 11:44:33.225157] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:49.328 [2024-06-10 11:44:33.226294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:49.328 [2024-06-10 11:44:33.226318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:33:49.328 pt3 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:49.328 11:44:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:49.328 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:33:49.587 malloc4 00:33:49.587 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:33:49.891 [2024-06-10 11:44:33.562418] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:33:49.891 [2024-06-10 11:44:33.562456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:49.891 [2024-06-10 11:44:33.562468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfeb50 00:33:49.891 [2024-06-10 11:44:33.562476] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:49.891 [2024-06-10 11:44:33.563617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:49.891 [2024-06-10 11:44:33.563641] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:33:49.891 pt4 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:33:49.891 [2024-06-10 11:44:33.726871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:33:49.891 [2024-06-10 11:44:33.727813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:49.891 [2024-06-10 11:44:33.727852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:33:49.891 [2024-06-10 11:44:33.727892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:33:49.891 [2024-06-10 11:44:33.728021] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdff1a0 00:33:49.891 [2024-06-10 11:44:33.728028] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:33:49.891 [2024-06-10 11:44:33.728161] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc4ff50 00:33:49.891 [2024-06-10 11:44:33.728258] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdff1a0 00:33:49.891 [2024-06-10 11:44:33.728265] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdff1a0 00:33:49.891 [2024-06-10 11:44:33.728330] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:49.891 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:50.176 11:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:50.176 "name": "raid_bdev1", 00:33:50.176 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:50.176 "strip_size_kb": 64, 00:33:50.176 "state": "online", 00:33:50.176 "raid_level": "raid0", 00:33:50.176 "superblock": true, 00:33:50.176 "num_base_bdevs": 4, 00:33:50.176 "num_base_bdevs_discovered": 4, 00:33:50.176 "num_base_bdevs_operational": 4, 00:33:50.176 "base_bdevs_list": [ 00:33:50.176 { 00:33:50.176 "name": "pt1", 00:33:50.176 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:50.176 "is_configured": true, 00:33:50.176 "data_offset": 2048, 00:33:50.176 "data_size": 63488 00:33:50.176 }, 00:33:50.176 { 00:33:50.176 "name": "pt2", 00:33:50.176 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:50.176 "is_configured": true, 00:33:50.176 "data_offset": 2048, 00:33:50.176 "data_size": 63488 00:33:50.176 }, 00:33:50.176 { 00:33:50.176 "name": "pt3", 00:33:50.176 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:50.176 "is_configured": true, 00:33:50.176 "data_offset": 2048, 00:33:50.176 "data_size": 63488 00:33:50.176 }, 00:33:50.176 { 00:33:50.176 "name": "pt4", 00:33:50.176 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:50.176 "is_configured": true, 00:33:50.176 "data_offset": 2048, 00:33:50.176 "data_size": 63488 00:33:50.176 } 00:33:50.176 ] 00:33:50.176 }' 00:33:50.176 11:44:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:50.176 11:44:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:50.742 [2024-06-10 11:44:34.545157] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:50.742 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:50.742 "name": "raid_bdev1", 00:33:50.742 "aliases": [ 00:33:50.742 "9c2efff6-efee-4212-85eb-2d09da355c39" 00:33:50.742 ], 00:33:50.742 "product_name": "Raid Volume", 00:33:50.742 "block_size": 512, 00:33:50.742 "num_blocks": 253952, 00:33:50.742 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:50.742 "assigned_rate_limits": { 00:33:50.742 "rw_ios_per_sec": 0, 00:33:50.742 "rw_mbytes_per_sec": 0, 00:33:50.742 "r_mbytes_per_sec": 0, 00:33:50.742 "w_mbytes_per_sec": 0 00:33:50.742 }, 00:33:50.742 "claimed": false, 00:33:50.742 "zoned": false, 00:33:50.742 "supported_io_types": { 00:33:50.742 "read": true, 00:33:50.742 "write": true, 00:33:50.742 
"unmap": true, 00:33:50.742 "write_zeroes": true, 00:33:50.742 "flush": true, 00:33:50.742 "reset": true, 00:33:50.742 "compare": false, 00:33:50.742 "compare_and_write": false, 00:33:50.742 "abort": false, 00:33:50.742 "nvme_admin": false, 00:33:50.742 "nvme_io": false 00:33:50.742 }, 00:33:50.742 "memory_domains": [ 00:33:50.742 { 00:33:50.742 "dma_device_id": "system", 00:33:50.742 "dma_device_type": 1 00:33:50.742 }, 00:33:50.742 { 00:33:50.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:50.742 "dma_device_type": 2 00:33:50.742 }, 00:33:50.742 { 00:33:50.742 "dma_device_id": "system", 00:33:50.742 "dma_device_type": 1 00:33:50.742 }, 00:33:50.743 { 00:33:50.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:50.743 "dma_device_type": 2 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "dma_device_id": "system", 00:33:50.743 "dma_device_type": 1 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:50.743 "dma_device_type": 2 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "dma_device_id": "system", 00:33:50.743 "dma_device_type": 1 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:50.743 "dma_device_type": 2 00:33:50.743 } 00:33:50.743 ], 00:33:50.743 "driver_specific": { 00:33:50.743 "raid": { 00:33:50.743 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:50.743 "strip_size_kb": 64, 00:33:50.743 "state": "online", 00:33:50.743 "raid_level": "raid0", 00:33:50.743 "superblock": true, 00:33:50.743 "num_base_bdevs": 4, 00:33:50.743 "num_base_bdevs_discovered": 4, 00:33:50.743 "num_base_bdevs_operational": 4, 00:33:50.743 "base_bdevs_list": [ 00:33:50.743 { 00:33:50.743 "name": "pt1", 00:33:50.743 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:50.743 "is_configured": true, 00:33:50.743 "data_offset": 2048, 00:33:50.743 "data_size": 63488 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "name": "pt2", 00:33:50.743 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:50.743 
"is_configured": true, 00:33:50.743 "data_offset": 2048, 00:33:50.743 "data_size": 63488 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "name": "pt3", 00:33:50.743 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:50.743 "is_configured": true, 00:33:50.743 "data_offset": 2048, 00:33:50.743 "data_size": 63488 00:33:50.743 }, 00:33:50.743 { 00:33:50.743 "name": "pt4", 00:33:50.743 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:50.743 "is_configured": true, 00:33:50.743 "data_offset": 2048, 00:33:50.743 "data_size": 63488 00:33:50.743 } 00:33:50.743 ] 00:33:50.743 } 00:33:50.743 } 00:33:50.743 }' 00:33:50.743 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:50.743 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:33:50.743 pt2 00:33:50.743 pt3 00:33:50.743 pt4' 00:33:50.743 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:50.743 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:33:50.743 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:51.001 "name": "pt1", 00:33:51.001 "aliases": [ 00:33:51.001 "00000000-0000-0000-0000-000000000001" 00:33:51.001 ], 00:33:51.001 "product_name": "passthru", 00:33:51.001 "block_size": 512, 00:33:51.001 "num_blocks": 65536, 00:33:51.001 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:51.001 "assigned_rate_limits": { 00:33:51.001 "rw_ios_per_sec": 0, 00:33:51.001 "rw_mbytes_per_sec": 0, 00:33:51.001 "r_mbytes_per_sec": 0, 00:33:51.001 "w_mbytes_per_sec": 0 00:33:51.001 }, 00:33:51.001 "claimed": true, 00:33:51.001 "claim_type": "exclusive_write", 
00:33:51.001 "zoned": false, 00:33:51.001 "supported_io_types": { 00:33:51.001 "read": true, 00:33:51.001 "write": true, 00:33:51.001 "unmap": true, 00:33:51.001 "write_zeroes": true, 00:33:51.001 "flush": true, 00:33:51.001 "reset": true, 00:33:51.001 "compare": false, 00:33:51.001 "compare_and_write": false, 00:33:51.001 "abort": true, 00:33:51.001 "nvme_admin": false, 00:33:51.001 "nvme_io": false 00:33:51.001 }, 00:33:51.001 "memory_domains": [ 00:33:51.001 { 00:33:51.001 "dma_device_id": "system", 00:33:51.001 "dma_device_type": 1 00:33:51.001 }, 00:33:51.001 { 00:33:51.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:51.001 "dma_device_type": 2 00:33:51.001 } 00:33:51.001 ], 00:33:51.001 "driver_specific": { 00:33:51.001 "passthru": { 00:33:51.001 "name": "pt1", 00:33:51.001 "base_bdev_name": "malloc1" 00:33:51.001 } 00:33:51.001 } 00:33:51.001 }' 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:51.001 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:51.258 11:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:51.258 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:51.258 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:51.258 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:51.258 11:44:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:51.258 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:51.258 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:33:51.258 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:51.516 "name": "pt2", 00:33:51.516 "aliases": [ 00:33:51.516 "00000000-0000-0000-0000-000000000002" 00:33:51.516 ], 00:33:51.516 "product_name": "passthru", 00:33:51.516 "block_size": 512, 00:33:51.516 "num_blocks": 65536, 00:33:51.516 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:51.516 "assigned_rate_limits": { 00:33:51.516 "rw_ios_per_sec": 0, 00:33:51.516 "rw_mbytes_per_sec": 0, 00:33:51.516 "r_mbytes_per_sec": 0, 00:33:51.516 "w_mbytes_per_sec": 0 00:33:51.516 }, 00:33:51.516 "claimed": true, 00:33:51.516 "claim_type": "exclusive_write", 00:33:51.516 "zoned": false, 00:33:51.516 "supported_io_types": { 00:33:51.516 "read": true, 00:33:51.516 "write": true, 00:33:51.516 "unmap": true, 00:33:51.516 "write_zeroes": true, 00:33:51.516 "flush": true, 00:33:51.516 "reset": true, 00:33:51.516 "compare": false, 00:33:51.516 "compare_and_write": false, 00:33:51.516 "abort": true, 00:33:51.516 "nvme_admin": false, 00:33:51.516 "nvme_io": false 00:33:51.516 }, 00:33:51.516 "memory_domains": [ 00:33:51.516 { 00:33:51.516 "dma_device_id": "system", 00:33:51.516 "dma_device_type": 1 00:33:51.516 }, 00:33:51.516 { 00:33:51.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:51.516 "dma_device_type": 2 00:33:51.516 } 00:33:51.516 ], 00:33:51.516 "driver_specific": { 00:33:51.516 "passthru": { 00:33:51.516 "name": "pt2", 00:33:51.516 "base_bdev_name": "malloc2" 00:33:51.516 } 00:33:51.516 } 
00:33:51.516 }' 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:51.516 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:51.774 "name": "pt3", 00:33:51.774 "aliases": [ 00:33:51.774 "00000000-0000-0000-0000-000000000003" 00:33:51.774 ], 00:33:51.774 "product_name": "passthru", 00:33:51.774 "block_size": 512, 00:33:51.774 "num_blocks": 65536, 00:33:51.774 "uuid": "00000000-0000-0000-0000-000000000003", 
00:33:51.774 "assigned_rate_limits": { 00:33:51.774 "rw_ios_per_sec": 0, 00:33:51.774 "rw_mbytes_per_sec": 0, 00:33:51.774 "r_mbytes_per_sec": 0, 00:33:51.774 "w_mbytes_per_sec": 0 00:33:51.774 }, 00:33:51.774 "claimed": true, 00:33:51.774 "claim_type": "exclusive_write", 00:33:51.774 "zoned": false, 00:33:51.774 "supported_io_types": { 00:33:51.774 "read": true, 00:33:51.774 "write": true, 00:33:51.774 "unmap": true, 00:33:51.774 "write_zeroes": true, 00:33:51.774 "flush": true, 00:33:51.774 "reset": true, 00:33:51.774 "compare": false, 00:33:51.774 "compare_and_write": false, 00:33:51.774 "abort": true, 00:33:51.774 "nvme_admin": false, 00:33:51.774 "nvme_io": false 00:33:51.774 }, 00:33:51.774 "memory_domains": [ 00:33:51.774 { 00:33:51.774 "dma_device_id": "system", 00:33:51.774 "dma_device_type": 1 00:33:51.774 }, 00:33:51.774 { 00:33:51.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:51.774 "dma_device_type": 2 00:33:51.774 } 00:33:51.774 ], 00:33:51.774 "driver_specific": { 00:33:51.774 "passthru": { 00:33:51.774 "name": "pt3", 00:33:51.774 "base_bdev_name": "malloc3" 00:33:51.774 } 00:33:51.774 } 00:33:51.774 }' 00:33:51.774 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:52.032 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:52.290 11:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:52.290 "name": "pt4", 00:33:52.290 "aliases": [ 00:33:52.290 "00000000-0000-0000-0000-000000000004" 00:33:52.290 ], 00:33:52.290 "product_name": "passthru", 00:33:52.290 "block_size": 512, 00:33:52.290 "num_blocks": 65536, 00:33:52.290 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:52.290 "assigned_rate_limits": { 00:33:52.290 "rw_ios_per_sec": 0, 00:33:52.290 "rw_mbytes_per_sec": 0, 00:33:52.290 "r_mbytes_per_sec": 0, 00:33:52.290 "w_mbytes_per_sec": 0 00:33:52.290 }, 00:33:52.290 "claimed": true, 00:33:52.290 "claim_type": "exclusive_write", 00:33:52.290 "zoned": false, 00:33:52.290 "supported_io_types": { 00:33:52.290 "read": true, 00:33:52.290 "write": true, 00:33:52.290 "unmap": true, 00:33:52.290 "write_zeroes": true, 00:33:52.290 "flush": true, 00:33:52.290 "reset": true, 00:33:52.290 "compare": false, 00:33:52.290 "compare_and_write": false, 00:33:52.290 "abort": true, 00:33:52.290 "nvme_admin": false, 00:33:52.290 "nvme_io": false 00:33:52.290 }, 00:33:52.290 "memory_domains": [ 00:33:52.290 { 00:33:52.290 "dma_device_id": "system", 00:33:52.290 "dma_device_type": 1 00:33:52.290 }, 00:33:52.290 { 00:33:52.290 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:52.290 "dma_device_type": 2 00:33:52.290 } 00:33:52.290 ], 00:33:52.290 "driver_specific": { 00:33:52.290 "passthru": { 00:33:52.290 "name": "pt4", 00:33:52.290 "base_bdev_name": "malloc4" 00:33:52.290 } 00:33:52.290 } 00:33:52.290 }' 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:52.290 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:33:52.548 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:52.806 [2024-06-10 11:44:36.590443] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:52.806 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # 
raid_bdev_uuid=9c2efff6-efee-4212-85eb-2d09da355c39 00:33:52.806 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9c2efff6-efee-4212-85eb-2d09da355c39 ']' 00:33:52.806 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:53.063 [2024-06-10 11:44:36.758689] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:53.063 [2024-06-10 11:44:36.758705] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:53.063 [2024-06-10 11:44:36.758744] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:53.063 [2024-06-10 11:44:36.758786] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:53.063 [2024-06-10 11:44:36.758794] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdff1a0 name raid_bdev1, state offline 00:33:53.063 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:33:53.063 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:53.063 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:33:53.063 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:33:53.063 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:53.064 11:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:33:53.321 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:53.321 11:44:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:53.321 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:53.321 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:33:53.579 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:53.579 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:33:53.836 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:33:54.093 [2024-06-10 11:44:37.933691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:33:54.093 [2024-06-10 11:44:37.934728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:33:54.093 [2024-06-10 11:44:37.934761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:33:54.093 [2024-06-10 11:44:37.934783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:33:54.093 [2024-06-10 11:44:37.934815] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 
00:33:54.093 [2024-06-10 11:44:37.934846] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:33:54.093 [2024-06-10 11:44:37.934879] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:33:54.093 [2024-06-10 11:44:37.934899] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:33:54.093 [2024-06-10 11:44:37.934913] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:54.093 [2024-06-10 11:44:37.934932] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe02750 name raid_bdev1, state configuring 00:33:54.093 request: 00:33:54.093 { 00:33:54.093 "name": "raid_bdev1", 00:33:54.093 "raid_level": "raid0", 00:33:54.093 "base_bdevs": [ 00:33:54.093 "malloc1", 00:33:54.093 "malloc2", 00:33:54.093 "malloc3", 00:33:54.093 "malloc4" 00:33:54.093 ], 00:33:54.093 "superblock": false, 00:33:54.093 "strip_size_kb": 64, 00:33:54.093 "method": "bdev_raid_create", 00:33:54.093 "req_id": 1 00:33:54.093 } 00:33:54.093 Got JSON-RPC error response 00:33:54.093 response: 00:33:54.093 { 00:33:54.093 "code": -17, 00:33:54.093 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:33:54.093 } 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:33:54.093 11:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:54.093 11:44:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:33:54.351 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:33:54.351 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:33:54.351 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:33:54.351 [2024-06-10 11:44:38.294582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:54.351 [2024-06-10 11:44:38.294616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:54.351 [2024-06-10 11:44:38.294632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdfdad0 00:33:54.351 [2024-06-10 11:44:38.294641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:54.351 [2024-06-10 11:44:38.295827] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:54.351 [2024-06-10 11:44:38.295851] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:54.351 [2024-06-10 11:44:38.295908] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:33:54.351 [2024-06-10 11:44:38.295930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:33:54.610 pt1 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:54.610 11:44:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:54.610 "name": "raid_bdev1", 00:33:54.610 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:54.610 "strip_size_kb": 64, 00:33:54.610 "state": "configuring", 00:33:54.610 "raid_level": "raid0", 00:33:54.610 "superblock": true, 00:33:54.610 "num_base_bdevs": 4, 00:33:54.610 "num_base_bdevs_discovered": 1, 00:33:54.610 "num_base_bdevs_operational": 4, 00:33:54.610 "base_bdevs_list": [ 00:33:54.610 { 00:33:54.610 "name": "pt1", 00:33:54.610 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:54.610 "is_configured": true, 00:33:54.610 "data_offset": 2048, 00:33:54.610 "data_size": 63488 00:33:54.610 }, 00:33:54.610 { 00:33:54.610 "name": null, 00:33:54.610 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:54.610 "is_configured": false, 00:33:54.610 "data_offset": 2048, 00:33:54.610 "data_size": 63488 00:33:54.610 }, 00:33:54.610 { 00:33:54.610 "name": null, 00:33:54.610 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:54.610 
"is_configured": false, 00:33:54.610 "data_offset": 2048, 00:33:54.610 "data_size": 63488 00:33:54.610 }, 00:33:54.610 { 00:33:54.610 "name": null, 00:33:54.610 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:54.610 "is_configured": false, 00:33:54.610 "data_offset": 2048, 00:33:54.610 "data_size": 63488 00:33:54.610 } 00:33:54.610 ] 00:33:54.610 }' 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:54.610 11:44:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:55.177 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:33:55.177 11:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:55.177 [2024-06-10 11:44:39.112696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:55.177 [2024-06-10 11:44:39.112740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:55.177 [2024-06-10 11:44:39.112752] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe01db0 00:33:55.177 [2024-06-10 11:44:39.112760] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:55.177 [2024-06-10 11:44:39.113000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:55.177 [2024-06-10 11:44:39.113014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:55.177 [2024-06-10 11:44:39.113061] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:55.177 [2024-06-10 11:44:39.113074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:55.177 pt2 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:55.435 [2024-06-10 11:44:39.285153] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:55.435 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:55.436 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:55.436 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:55.436 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:55.694 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:55.694 "name": "raid_bdev1", 00:33:55.694 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:55.694 "strip_size_kb": 64, 00:33:55.694 "state": "configuring", 00:33:55.694 "raid_level": "raid0", 00:33:55.694 "superblock": true, 
00:33:55.694 "num_base_bdevs": 4, 00:33:55.694 "num_base_bdevs_discovered": 1, 00:33:55.694 "num_base_bdevs_operational": 4, 00:33:55.694 "base_bdevs_list": [ 00:33:55.694 { 00:33:55.694 "name": "pt1", 00:33:55.694 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:55.694 "is_configured": true, 00:33:55.694 "data_offset": 2048, 00:33:55.694 "data_size": 63488 00:33:55.694 }, 00:33:55.694 { 00:33:55.694 "name": null, 00:33:55.694 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:55.694 "is_configured": false, 00:33:55.694 "data_offset": 2048, 00:33:55.694 "data_size": 63488 00:33:55.694 }, 00:33:55.694 { 00:33:55.694 "name": null, 00:33:55.694 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:55.694 "is_configured": false, 00:33:55.694 "data_offset": 2048, 00:33:55.694 "data_size": 63488 00:33:55.694 }, 00:33:55.694 { 00:33:55.694 "name": null, 00:33:55.694 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:55.694 "is_configured": false, 00:33:55.694 "data_offset": 2048, 00:33:55.694 "data_size": 63488 00:33:55.694 } 00:33:55.694 ] 00:33:55.694 }' 00:33:55.694 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:55.694 11:44:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:56.262 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:33:56.262 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:56.262 11:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:56.262 [2024-06-10 11:44:40.119299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:56.262 [2024-06-10 11:44:40.119343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:56.262 [2024-06-10 
11:44:40.119357] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc50530 00:33:56.262 [2024-06-10 11:44:40.119365] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:56.262 [2024-06-10 11:44:40.119609] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:56.262 [2024-06-10 11:44:40.119623] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:56.262 [2024-06-10 11:44:40.119670] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:56.262 [2024-06-10 11:44:40.119685] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:56.262 pt2 00:33:56.262 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:56.262 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:56.262 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:33:56.520 [2024-06-10 11:44:40.291746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:33:56.520 [2024-06-10 11:44:40.291782] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:56.520 [2024-06-10 11:44:40.291796] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc507a0 00:33:56.520 [2024-06-10 11:44:40.291804] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:56.520 [2024-06-10 11:44:40.292039] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:56.520 [2024-06-10 11:44:40.292052] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:33:56.520 [2024-06-10 11:44:40.292096] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt3 00:33:56.520 [2024-06-10 11:44:40.292109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:33:56.520 pt3 00:33:56.520 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:56.520 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:56.520 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:33:56.520 [2024-06-10 11:44:40.464193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:33:56.520 [2024-06-10 11:44:40.464232] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:56.520 [2024-06-10 11:44:40.464244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe01970 00:33:56.520 [2024-06-10 11:44:40.464253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:56.520 [2024-06-10 11:44:40.464490] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:56.520 [2024-06-10 11:44:40.464504] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:33:56.520 [2024-06-10 11:44:40.464547] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:33:56.520 [2024-06-10 11:44:40.464561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:33:56.520 [2024-06-10 11:44:40.464650] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe020d0 00:33:56.520 [2024-06-10 11:44:40.464657] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:33:56.520 [2024-06-10 11:44:40.464772] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc4ffc0 00:33:56.520 [2024-06-10 11:44:40.464860] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe020d0 00:33:56.520 [2024-06-10 11:44:40.464877] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe020d0 00:33:56.520 [2024-06-10 11:44:40.464949] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:56.778 pt4 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:56.778 "name": "raid_bdev1", 00:33:56.778 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:56.778 "strip_size_kb": 64, 00:33:56.778 "state": "online", 00:33:56.778 "raid_level": "raid0", 00:33:56.778 "superblock": true, 00:33:56.778 "num_base_bdevs": 4, 00:33:56.778 "num_base_bdevs_discovered": 4, 00:33:56.778 "num_base_bdevs_operational": 4, 00:33:56.778 "base_bdevs_list": [ 00:33:56.778 { 00:33:56.778 "name": "pt1", 00:33:56.778 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:56.778 "is_configured": true, 00:33:56.778 "data_offset": 2048, 00:33:56.778 "data_size": 63488 00:33:56.778 }, 00:33:56.778 { 00:33:56.778 "name": "pt2", 00:33:56.778 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:56.778 "is_configured": true, 00:33:56.778 "data_offset": 2048, 00:33:56.778 "data_size": 63488 00:33:56.778 }, 00:33:56.778 { 00:33:56.778 "name": "pt3", 00:33:56.778 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:56.778 "is_configured": true, 00:33:56.778 "data_offset": 2048, 00:33:56.778 "data_size": 63488 00:33:56.778 }, 00:33:56.778 { 00:33:56.778 "name": "pt4", 00:33:56.778 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:56.778 "is_configured": true, 00:33:56.778 "data_offset": 2048, 00:33:56.778 "data_size": 63488 00:33:56.778 } 00:33:56.778 ] 00:33:56.778 }' 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:56.778 11:44:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:57.344 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:57.603 [2024-06-10 11:44:41.298526] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:57.603 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:57.603 "name": "raid_bdev1", 00:33:57.603 "aliases": [ 00:33:57.603 "9c2efff6-efee-4212-85eb-2d09da355c39" 00:33:57.603 ], 00:33:57.603 "product_name": "Raid Volume", 00:33:57.603 "block_size": 512, 00:33:57.603 "num_blocks": 253952, 00:33:57.603 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:57.603 "assigned_rate_limits": { 00:33:57.603 "rw_ios_per_sec": 0, 00:33:57.603 "rw_mbytes_per_sec": 0, 00:33:57.603 "r_mbytes_per_sec": 0, 00:33:57.603 "w_mbytes_per_sec": 0 00:33:57.603 }, 00:33:57.603 "claimed": false, 00:33:57.603 "zoned": false, 00:33:57.603 "supported_io_types": { 00:33:57.603 "read": true, 00:33:57.603 "write": true, 00:33:57.603 "unmap": true, 00:33:57.603 "write_zeroes": true, 00:33:57.603 "flush": true, 00:33:57.603 "reset": true, 00:33:57.603 "compare": false, 00:33:57.603 "compare_and_write": false, 00:33:57.603 "abort": false, 00:33:57.603 "nvme_admin": false, 00:33:57.603 "nvme_io": false 00:33:57.603 }, 00:33:57.603 "memory_domains": [ 00:33:57.603 { 00:33:57.603 "dma_device_id": "system", 00:33:57.603 "dma_device_type": 1 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:57.603 "dma_device_type": 2 00:33:57.603 }, 00:33:57.603 { 
00:33:57.603 "dma_device_id": "system", 00:33:57.603 "dma_device_type": 1 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:57.603 "dma_device_type": 2 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "dma_device_id": "system", 00:33:57.603 "dma_device_type": 1 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:57.603 "dma_device_type": 2 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "dma_device_id": "system", 00:33:57.603 "dma_device_type": 1 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:57.603 "dma_device_type": 2 00:33:57.603 } 00:33:57.603 ], 00:33:57.603 "driver_specific": { 00:33:57.603 "raid": { 00:33:57.603 "uuid": "9c2efff6-efee-4212-85eb-2d09da355c39", 00:33:57.603 "strip_size_kb": 64, 00:33:57.603 "state": "online", 00:33:57.603 "raid_level": "raid0", 00:33:57.603 "superblock": true, 00:33:57.603 "num_base_bdevs": 4, 00:33:57.603 "num_base_bdevs_discovered": 4, 00:33:57.603 "num_base_bdevs_operational": 4, 00:33:57.603 "base_bdevs_list": [ 00:33:57.603 { 00:33:57.603 "name": "pt1", 00:33:57.603 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:57.603 "is_configured": true, 00:33:57.603 "data_offset": 2048, 00:33:57.603 "data_size": 63488 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "name": "pt2", 00:33:57.603 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:57.603 "is_configured": true, 00:33:57.603 "data_offset": 2048, 00:33:57.603 "data_size": 63488 00:33:57.603 }, 00:33:57.603 { 00:33:57.603 "name": "pt3", 00:33:57.604 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:57.604 "is_configured": true, 00:33:57.604 "data_offset": 2048, 00:33:57.604 "data_size": 63488 00:33:57.604 }, 00:33:57.604 { 00:33:57.604 "name": "pt4", 00:33:57.604 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:57.604 "is_configured": true, 00:33:57.604 "data_offset": 2048, 00:33:57.604 "data_size": 63488 00:33:57.604 } 00:33:57.604 ] 
00:33:57.604 } 00:33:57.604 } 00:33:57.604 }' 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:33:57.604 pt2 00:33:57.604 pt3 00:33:57.604 pt4' 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:57.604 "name": "pt1", 00:33:57.604 "aliases": [ 00:33:57.604 "00000000-0000-0000-0000-000000000001" 00:33:57.604 ], 00:33:57.604 "product_name": "passthru", 00:33:57.604 "block_size": 512, 00:33:57.604 "num_blocks": 65536, 00:33:57.604 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:57.604 "assigned_rate_limits": { 00:33:57.604 "rw_ios_per_sec": 0, 00:33:57.604 "rw_mbytes_per_sec": 0, 00:33:57.604 "r_mbytes_per_sec": 0, 00:33:57.604 "w_mbytes_per_sec": 0 00:33:57.604 }, 00:33:57.604 "claimed": true, 00:33:57.604 "claim_type": "exclusive_write", 00:33:57.604 "zoned": false, 00:33:57.604 "supported_io_types": { 00:33:57.604 "read": true, 00:33:57.604 "write": true, 00:33:57.604 "unmap": true, 00:33:57.604 "write_zeroes": true, 00:33:57.604 "flush": true, 00:33:57.604 "reset": true, 00:33:57.604 "compare": false, 00:33:57.604 "compare_and_write": false, 00:33:57.604 "abort": true, 00:33:57.604 "nvme_admin": false, 00:33:57.604 "nvme_io": false 00:33:57.604 }, 00:33:57.604 "memory_domains": [ 00:33:57.604 { 00:33:57.604 "dma_device_id": "system", 00:33:57.604 "dma_device_type": 1 00:33:57.604 }, 
00:33:57.604 { 00:33:57.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:57.604 "dma_device_type": 2 00:33:57.604 } 00:33:57.604 ], 00:33:57.604 "driver_specific": { 00:33:57.604 "passthru": { 00:33:57.604 "name": "pt1", 00:33:57.604 "base_bdev_name": "malloc1" 00:33:57.604 } 00:33:57.604 } 00:33:57.604 }' 00:33:57.604 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:57.862 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:58.120 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:58.120 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:58.120 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:33:58.120 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:58.120 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:33:58.120 "name": "pt2", 00:33:58.120 "aliases": [ 00:33:58.120 "00000000-0000-0000-0000-000000000002" 00:33:58.120 ], 00:33:58.120 "product_name": "passthru", 00:33:58.120 "block_size": 512, 00:33:58.120 "num_blocks": 65536, 00:33:58.120 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:58.120 "assigned_rate_limits": { 00:33:58.120 "rw_ios_per_sec": 0, 00:33:58.120 "rw_mbytes_per_sec": 0, 00:33:58.120 "r_mbytes_per_sec": 0, 00:33:58.120 "w_mbytes_per_sec": 0 00:33:58.120 }, 00:33:58.120 "claimed": true, 00:33:58.120 "claim_type": "exclusive_write", 00:33:58.120 "zoned": false, 00:33:58.120 "supported_io_types": { 00:33:58.120 "read": true, 00:33:58.120 "write": true, 00:33:58.120 "unmap": true, 00:33:58.120 "write_zeroes": true, 00:33:58.120 "flush": true, 00:33:58.120 "reset": true, 00:33:58.120 "compare": false, 00:33:58.120 "compare_and_write": false, 00:33:58.120 "abort": true, 00:33:58.120 "nvme_admin": false, 00:33:58.120 "nvme_io": false 00:33:58.120 }, 00:33:58.120 "memory_domains": [ 00:33:58.120 { 00:33:58.121 "dma_device_id": "system", 00:33:58.121 "dma_device_type": 1 00:33:58.121 }, 00:33:58.121 { 00:33:58.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:58.121 "dma_device_type": 2 00:33:58.121 } 00:33:58.121 ], 00:33:58.121 "driver_specific": { 00:33:58.121 "passthru": { 00:33:58.121 "name": "pt2", 00:33:58.121 "base_bdev_name": "malloc2" 00:33:58.121 } 00:33:58.121 } 00:33:58.121 }' 00:33:58.121 11:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:58.121 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:58.121 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:58.121 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# [[ null == null ]] 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:33:58.380 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:58.638 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:58.638 "name": "pt3", 00:33:58.638 "aliases": [ 00:33:58.638 "00000000-0000-0000-0000-000000000003" 00:33:58.638 ], 00:33:58.638 "product_name": "passthru", 00:33:58.638 "block_size": 512, 00:33:58.638 "num_blocks": 65536, 00:33:58.638 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:58.638 "assigned_rate_limits": { 00:33:58.638 "rw_ios_per_sec": 0, 00:33:58.638 "rw_mbytes_per_sec": 0, 00:33:58.638 "r_mbytes_per_sec": 0, 00:33:58.638 "w_mbytes_per_sec": 0 00:33:58.638 }, 00:33:58.638 "claimed": true, 00:33:58.638 "claim_type": "exclusive_write", 00:33:58.638 "zoned": false, 00:33:58.638 "supported_io_types": { 00:33:58.638 "read": true, 00:33:58.638 "write": true, 00:33:58.638 "unmap": true, 00:33:58.638 "write_zeroes": true, 00:33:58.638 "flush": true, 00:33:58.638 "reset": true, 00:33:58.638 "compare": false, 00:33:58.638 "compare_and_write": false, 
00:33:58.638 "abort": true, 00:33:58.638 "nvme_admin": false, 00:33:58.638 "nvme_io": false 00:33:58.638 }, 00:33:58.638 "memory_domains": [ 00:33:58.638 { 00:33:58.638 "dma_device_id": "system", 00:33:58.638 "dma_device_type": 1 00:33:58.638 }, 00:33:58.638 { 00:33:58.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:58.638 "dma_device_type": 2 00:33:58.638 } 00:33:58.638 ], 00:33:58.638 "driver_specific": { 00:33:58.638 "passthru": { 00:33:58.638 "name": "pt3", 00:33:58.638 "base_bdev_name": "malloc3" 00:33:58.638 } 00:33:58.638 } 00:33:58.638 }' 00:33:58.638 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:58.638 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:58.638 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:58.638 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:58.638 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:58.895 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:33:58.896 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:59.154 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:59.154 "name": "pt4", 00:33:59.154 "aliases": [ 00:33:59.154 "00000000-0000-0000-0000-000000000004" 00:33:59.154 ], 00:33:59.154 "product_name": "passthru", 00:33:59.154 "block_size": 512, 00:33:59.154 "num_blocks": 65536, 00:33:59.154 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:59.154 "assigned_rate_limits": { 00:33:59.154 "rw_ios_per_sec": 0, 00:33:59.154 "rw_mbytes_per_sec": 0, 00:33:59.154 "r_mbytes_per_sec": 0, 00:33:59.154 "w_mbytes_per_sec": 0 00:33:59.154 }, 00:33:59.154 "claimed": true, 00:33:59.154 "claim_type": "exclusive_write", 00:33:59.154 "zoned": false, 00:33:59.154 "supported_io_types": { 00:33:59.154 "read": true, 00:33:59.154 "write": true, 00:33:59.154 "unmap": true, 00:33:59.154 "write_zeroes": true, 00:33:59.154 "flush": true, 00:33:59.154 "reset": true, 00:33:59.154 "compare": false, 00:33:59.154 "compare_and_write": false, 00:33:59.154 "abort": true, 00:33:59.154 "nvme_admin": false, 00:33:59.154 "nvme_io": false 00:33:59.154 }, 00:33:59.154 "memory_domains": [ 00:33:59.154 { 00:33:59.154 "dma_device_id": "system", 00:33:59.154 "dma_device_type": 1 00:33:59.154 }, 00:33:59.154 { 00:33:59.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:59.154 "dma_device_type": 2 00:33:59.154 } 00:33:59.154 ], 00:33:59.154 "driver_specific": { 00:33:59.154 "passthru": { 00:33:59.154 "name": "pt4", 00:33:59.154 "base_bdev_name": "malloc4" 00:33:59.154 } 00:33:59.154 } 00:33:59.154 }' 00:33:59.154 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:59.154 11:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:59.154 11:44:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:59.154 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:59.154 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:59.412 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:33:59.671 [2024-06-10 11:44:43.428059] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9c2efff6-efee-4212-85eb-2d09da355c39 '!=' 9c2efff6-efee-4212-85eb-2d09da355c39 ']' 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 187585 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 187585 ']' 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 187585 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 187585 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 187585' 00:33:59.671 killing process with pid 187585 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 187585 00:33:59.671 [2024-06-10 11:44:43.493842] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:59.671 [2024-06-10 11:44:43.493902] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:59.671 [2024-06-10 11:44:43.493966] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:59.671 [2024-06-10 11:44:43.493976] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe020d0 name raid_bdev1, state offline 00:33:59.671 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 187585 00:33:59.671 [2024-06-10 11:44:43.534643] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:59.929 11:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:33:59.929 00:33:59.929 real 0m12.376s 00:33:59.929 user 0m22.133s 00:33:59.929 sys 0m2.343s 00:33:59.929 11:44:43 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:33:59.929 11:44:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:59.929 ************************************ 00:33:59.929 END TEST raid_superblock_test 00:33:59.929 ************************************ 00:33:59.929 11:44:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:33:59.929 11:44:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:33:59.929 11:44:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:59.929 11:44:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:59.929 ************************************ 00:33:59.929 START TEST raid_read_error_test 00:33:59.929 ************************************ 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 read 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:33:59.929 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:59.930 11:44:43 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.aRDP4oMlIe 00:33:59.930 
11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=189601 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 189601 /var/tmp/spdk-raid.sock 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 189601 ']' 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:59.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:59.930 11:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:00.188 [2024-06-10 11:44:43.889381] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:34:00.188 [2024-06-10 11:44:43.889436] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid189601 ] 00:34:00.188 [2024-06-10 11:44:43.975710] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:00.188 [2024-06-10 11:44:44.063189] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:34:00.445 [2024-06-10 11:44:44.136829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:00.446 [2024-06-10 11:44:44.136856] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:01.011 11:44:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:34:01.011 11:44:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:34:01.011 11:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:01.011 11:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:34:01.011 BaseBdev1_malloc 00:34:01.012 11:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:34:01.269 true 00:34:01.269 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:34:01.269 [2024-06-10 11:44:45.206652] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:34:01.269 [2024-06-10 11:44:45.206691] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:01.269 
[2024-06-10 11:44:45.206707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138ab10 00:34:01.269 [2024-06-10 11:44:45.206716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:01.269 [2024-06-10 11:44:45.208153] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:01.269 [2024-06-10 11:44:45.208179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:01.269 BaseBdev1 00:34:01.527 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:01.527 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:34:01.527 BaseBdev2_malloc 00:34:01.527 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:34:01.785 true 00:34:01.785 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:34:01.785 [2024-06-10 11:44:45.728849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:34:01.785 [2024-06-10 11:44:45.728893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:01.785 [2024-06-10 11:44:45.728913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138f280 00:34:01.785 [2024-06-10 11:44:45.728922] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:01.785 [2024-06-10 11:44:45.730131] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:01.785 [2024-06-10 11:44:45.730154] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:34:02.042 BaseBdev2 00:34:02.042 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:02.042 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:34:02.042 BaseBdev3_malloc 00:34:02.042 11:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:34:02.300 true 00:34:02.300 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:34:02.558 [2024-06-10 11:44:46.249879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:34:02.558 [2024-06-10 11:44:46.249912] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:02.558 [2024-06-10 11:44:46.249928] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1391ab0 00:34:02.558 [2024-06-10 11:44:46.249937] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:02.558 [2024-06-10 11:44:46.251065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:02.558 [2024-06-10 11:44:46.251088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:34:02.558 BaseBdev3 00:34:02.558 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:02.558 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:34:02.558 BaseBdev4_malloc 00:34:02.558 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:34:02.815 true 00:34:02.815 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:34:03.073 [2024-06-10 11:44:46.776246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:34:03.073 [2024-06-10 11:44:46.776282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:03.073 [2024-06-10 11:44:46.776296] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1392380 00:34:03.073 [2024-06-10 11:44:46.776305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:03.073 [2024-06-10 11:44:46.777456] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:03.073 [2024-06-10 11:44:46.777478] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:34:03.073 BaseBdev4 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:34:03.073 [2024-06-10 11:44:46.948722] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:03.073 [2024-06-10 11:44:46.949701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:03.073 [2024-06-10 11:44:46.949748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:03.073 [2024-06-10 11:44:46.949786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:34:03.073 [2024-06-10 11:44:46.949976] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x138a020 00:34:03.073 [2024-06-10 11:44:46.949988] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:34:03.073 [2024-06-10 11:44:46.950135] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138c400 00:34:03.073 [2024-06-10 11:44:46.950247] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x138a020 00:34:03.073 [2024-06-10 11:44:46.950254] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x138a020 00:34:03.073 [2024-06-10 11:44:46.950329] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:03.073 11:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:03.330 11:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:03.330 "name": "raid_bdev1", 00:34:03.330 "uuid": "754cbbf5-0b09-4438-bffa-da22305773f7", 00:34:03.330 "strip_size_kb": 64, 00:34:03.330 "state": "online", 00:34:03.331 "raid_level": "raid0", 00:34:03.331 "superblock": true, 00:34:03.331 "num_base_bdevs": 4, 00:34:03.331 "num_base_bdevs_discovered": 4, 00:34:03.331 "num_base_bdevs_operational": 4, 00:34:03.331 "base_bdevs_list": [ 00:34:03.331 { 00:34:03.331 "name": "BaseBdev1", 00:34:03.331 "uuid": "1d8ca7b7-b5d5-5ee7-9668-01e858426a92", 00:34:03.331 "is_configured": true, 00:34:03.331 "data_offset": 2048, 00:34:03.331 "data_size": 63488 00:34:03.331 }, 00:34:03.331 { 00:34:03.331 "name": "BaseBdev2", 00:34:03.331 "uuid": "aa5caa13-f9fd-501e-bf0a-c513400a9244", 00:34:03.331 "is_configured": true, 00:34:03.331 "data_offset": 2048, 00:34:03.331 "data_size": 63488 00:34:03.331 }, 00:34:03.331 { 00:34:03.331 "name": "BaseBdev3", 00:34:03.331 "uuid": "f5b8e1c3-3760-5acc-9441-29b7a11cde3b", 00:34:03.331 "is_configured": true, 00:34:03.331 "data_offset": 2048, 00:34:03.331 "data_size": 63488 00:34:03.331 }, 00:34:03.331 { 00:34:03.331 "name": "BaseBdev4", 00:34:03.331 "uuid": "b121576e-7902-511a-84ea-e889ec1215d1", 00:34:03.331 "is_configured": true, 00:34:03.331 "data_offset": 2048, 00:34:03.331 "data_size": 63488 00:34:03.331 } 00:34:03.331 ] 00:34:03.331 }' 00:34:03.331 11:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:03.331 11:44:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:03.896 11:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:34:03.896 11:44:47 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:34:03.896 [2024-06-10 11:44:47.710891] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138c670 00:34:04.829 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:05.087 11:44:48 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:05.087 "name": "raid_bdev1", 00:34:05.087 "uuid": "754cbbf5-0b09-4438-bffa-da22305773f7", 00:34:05.087 "strip_size_kb": 64, 00:34:05.087 "state": "online", 00:34:05.087 "raid_level": "raid0", 00:34:05.087 "superblock": true, 00:34:05.087 "num_base_bdevs": 4, 00:34:05.087 "num_base_bdevs_discovered": 4, 00:34:05.087 "num_base_bdevs_operational": 4, 00:34:05.087 "base_bdevs_list": [ 00:34:05.087 { 00:34:05.087 "name": "BaseBdev1", 00:34:05.087 "uuid": "1d8ca7b7-b5d5-5ee7-9668-01e858426a92", 00:34:05.087 "is_configured": true, 00:34:05.087 "data_offset": 2048, 00:34:05.087 "data_size": 63488 00:34:05.087 }, 00:34:05.087 { 00:34:05.087 "name": "BaseBdev2", 00:34:05.087 "uuid": "aa5caa13-f9fd-501e-bf0a-c513400a9244", 00:34:05.087 "is_configured": true, 00:34:05.087 "data_offset": 2048, 00:34:05.087 "data_size": 63488 00:34:05.087 }, 00:34:05.087 { 00:34:05.087 "name": "BaseBdev3", 00:34:05.087 "uuid": "f5b8e1c3-3760-5acc-9441-29b7a11cde3b", 00:34:05.087 "is_configured": true, 00:34:05.087 "data_offset": 2048, 00:34:05.087 "data_size": 63488 00:34:05.087 }, 00:34:05.087 { 00:34:05.087 "name": "BaseBdev4", 00:34:05.087 "uuid": "b121576e-7902-511a-84ea-e889ec1215d1", 00:34:05.087 "is_configured": true, 00:34:05.087 "data_offset": 2048, 00:34:05.087 "data_size": 63488 00:34:05.087 } 00:34:05.087 ] 00:34:05.087 }' 00:34:05.087 11:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:05.088 11:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:05.652 11:44:49 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:05.909 [2024-06-10 11:44:49.628097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:05.909 [2024-06-10 11:44:49.628128] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:05.909 [2024-06-10 11:44:49.630177] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:05.910 [2024-06-10 11:44:49.630204] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:05.910 [2024-06-10 11:44:49.630230] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:05.910 [2024-06-10 11:44:49.630238] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x138a020 name raid_bdev1, state offline 00:34:05.910 0 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 189601 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 189601 ']' 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 189601 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 189601 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 189601' 00:34:05.910 killing process with pid 189601 
00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 189601 00:34:05.910 [2024-06-10 11:44:49.704012] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:05.910 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 189601 00:34:05.910 [2024-06-10 11:44:49.738112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.aRDP4oMlIe 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:34:06.168 00:34:06.168 real 0m6.132s 00:34:06.168 user 0m9.405s 00:34:06.168 sys 0m1.126s 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:34:06.168 11:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:06.168 ************************************ 00:34:06.168 END TEST raid_read_error_test 00:34:06.168 ************************************ 00:34:06.168 11:44:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:34:06.168 11:44:49 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:34:06.168 11:44:49 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:34:06.168 11:44:49 
bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:34:06.168 ************************************ 00:34:06.168 START TEST raid_write_error_test 00:34:06.168 ************************************ 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 write 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:34:06.168 11:44:50 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.samwN1Hfmj 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=190479 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 190479 /var/tmp/spdk-raid.sock 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 190479 ']' 00:34:06.168 
11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:34:06.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:34:06.168 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:06.168 [2024-06-10 11:44:50.106464] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:34:06.168 [2024-06-10 11:44:50.106515] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190479 ] 00:34:06.426 [2024-06-10 11:44:50.192493] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:06.426 [2024-06-10 11:44:50.270520] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:34:06.426 [2024-06-10 11:44:50.323453] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:06.426 [2024-06-10 11:44:50.323486] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:06.992 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:34:06.992 11:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:34:06.992 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:06.992 11:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:34:07.250 BaseBdev1_malloc 00:34:07.250 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:34:07.508 true 00:34:07.508 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:34:07.508 [2024-06-10 11:44:51.390566] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:34:07.508 [2024-06-10 11:44:51.390605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:07.508 [2024-06-10 11:44:51.390617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2213b10 00:34:07.508 [2024-06-10 11:44:51.390626] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:07.508 [2024-06-10 11:44:51.391748] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:07.508 [2024-06-10 11:44:51.391771] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:07.508 BaseBdev1 00:34:07.508 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:07.508 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:34:07.766 BaseBdev2_malloc 00:34:07.766 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:34:08.024 true 00:34:08.024 11:44:51 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:34:08.024 [2024-06-10 11:44:51.911561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:34:08.024 [2024-06-10 11:44:51.911596] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:08.024 [2024-06-10 11:44:51.911609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2218280 00:34:08.024 [2024-06-10 11:44:51.911618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:08.024 [2024-06-10 11:44:51.912768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:08.024 [2024-06-10 11:44:51.912792] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:34:08.024 BaseBdev2 00:34:08.024 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:08.024 11:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:34:08.281 BaseBdev3_malloc 00:34:08.281 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:34:08.539 true 00:34:08.539 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:34:08.539 [2024-06-10 11:44:52.444716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:34:08.539 [2024-06-10 11:44:52.444757] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:08.539 [2024-06-10 11:44:52.444773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221aab0 00:34:08.539 [2024-06-10 11:44:52.444781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:08.539 [2024-06-10 11:44:52.445835] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:08.539 [2024-06-10 11:44:52.445861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:34:08.539 BaseBdev3 00:34:08.539 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:08.539 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:34:08.796 BaseBdev4_malloc 00:34:08.796 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:34:09.054 true 00:34:09.054 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:34:09.054 [2024-06-10 11:44:52.949796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:34:09.054 [2024-06-10 11:44:52.949833] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:09.054 [2024-06-10 11:44:52.949847] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221b380 00:34:09.054 [2024-06-10 11:44:52.949860] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:09.054 [2024-06-10 11:44:52.950882] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:34:09.054 [2024-06-10 11:44:52.950905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:34:09.054 BaseBdev4 00:34:09.054 11:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:34:09.312 [2024-06-10 11:44:53.122278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:09.312 [2024-06-10 11:44:53.123086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:09.312 [2024-06-10 11:44:53.123130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:09.312 [2024-06-10 11:44:53.123168] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:34:09.312 [2024-06-10 11:44:53.123319] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2213020 00:34:09.312 [2024-06-10 11:44:53.123326] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:34:09.312 [2024-06-10 11:44:53.123454] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2215400 00:34:09.312 [2024-06-10 11:44:53.123553] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2213020 00:34:09.312 [2024-06-10 11:44:53.123559] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2213020 00:34:09.312 [2024-06-10 11:44:53.123623] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:09.312 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:09.570 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:09.570 "name": "raid_bdev1", 00:34:09.570 "uuid": "27b37b86-8233-43d4-8f26-f649ccc2def2", 00:34:09.570 "strip_size_kb": 64, 00:34:09.570 "state": "online", 00:34:09.570 "raid_level": "raid0", 00:34:09.570 "superblock": true, 00:34:09.570 "num_base_bdevs": 4, 00:34:09.570 "num_base_bdevs_discovered": 4, 00:34:09.570 "num_base_bdevs_operational": 4, 00:34:09.570 "base_bdevs_list": [ 00:34:09.570 { 00:34:09.570 "name": "BaseBdev1", 00:34:09.570 "uuid": "d94d3f05-f0e3-58e8-ac2e-28c300fa1177", 00:34:09.570 "is_configured": true, 00:34:09.570 "data_offset": 2048, 00:34:09.570 "data_size": 63488 00:34:09.570 }, 00:34:09.570 { 00:34:09.570 "name": "BaseBdev2", 00:34:09.570 "uuid": "f580b4ac-3916-5ec1-97a1-01bd1811ee2e", 00:34:09.570 "is_configured": true, 
00:34:09.570 "data_offset": 2048, 00:34:09.570 "data_size": 63488 00:34:09.570 }, 00:34:09.570 { 00:34:09.570 "name": "BaseBdev3", 00:34:09.570 "uuid": "17688677-14db-5363-9694-63381f067ea8", 00:34:09.570 "is_configured": true, 00:34:09.570 "data_offset": 2048, 00:34:09.570 "data_size": 63488 00:34:09.570 }, 00:34:09.570 { 00:34:09.570 "name": "BaseBdev4", 00:34:09.570 "uuid": "6e36b0e3-f322-5bcb-8934-fee496627e41", 00:34:09.570 "is_configured": true, 00:34:09.570 "data_offset": 2048, 00:34:09.570 "data_size": 63488 00:34:09.570 } 00:34:09.570 ] 00:34:09.570 }' 00:34:09.570 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:09.570 11:44:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:10.133 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:34:10.133 11:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:34:10.133 [2024-06-10 11:44:53.912605] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2215670 00:34:11.066 11:44:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:11.324 "name": "raid_bdev1", 00:34:11.324 "uuid": "27b37b86-8233-43d4-8f26-f649ccc2def2", 00:34:11.324 "strip_size_kb": 64, 00:34:11.324 "state": "online", 00:34:11.324 "raid_level": "raid0", 00:34:11.324 "superblock": true, 00:34:11.324 "num_base_bdevs": 4, 00:34:11.324 "num_base_bdevs_discovered": 4, 00:34:11.324 "num_base_bdevs_operational": 4, 00:34:11.324 "base_bdevs_list": [ 00:34:11.324 { 00:34:11.324 "name": "BaseBdev1", 00:34:11.324 "uuid": "d94d3f05-f0e3-58e8-ac2e-28c300fa1177", 00:34:11.324 "is_configured": true, 00:34:11.324 "data_offset": 2048, 00:34:11.324 "data_size": 63488 00:34:11.324 }, 00:34:11.324 { 00:34:11.324 "name": "BaseBdev2", 00:34:11.324 
"uuid": "f580b4ac-3916-5ec1-97a1-01bd1811ee2e", 00:34:11.324 "is_configured": true, 00:34:11.324 "data_offset": 2048, 00:34:11.324 "data_size": 63488 00:34:11.324 }, 00:34:11.324 { 00:34:11.324 "name": "BaseBdev3", 00:34:11.324 "uuid": "17688677-14db-5363-9694-63381f067ea8", 00:34:11.324 "is_configured": true, 00:34:11.324 "data_offset": 2048, 00:34:11.324 "data_size": 63488 00:34:11.324 }, 00:34:11.324 { 00:34:11.324 "name": "BaseBdev4", 00:34:11.324 "uuid": "6e36b0e3-f322-5bcb-8934-fee496627e41", 00:34:11.324 "is_configured": true, 00:34:11.324 "data_offset": 2048, 00:34:11.324 "data_size": 63488 00:34:11.324 } 00:34:11.324 ] 00:34:11.324 }' 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:11.324 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:11.889 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:11.889 [2024-06-10 11:44:55.833675] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:11.889 [2024-06-10 11:44:55.833710] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:12.147 [2024-06-10 11:44:55.835941] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:12.147 [2024-06-10 11:44:55.835971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:12.147 [2024-06-10 11:44:55.836000] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:12.147 [2024-06-10 11:44:55.836007] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2213020 name raid_bdev1, state offline 00:34:12.147 0 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 190479 00:34:12.147 11:44:55 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 190479 ']' 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 190479 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 190479 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 190479' 00:34:12.147 killing process with pid 190479 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 190479 00:34:12.147 [2024-06-10 11:44:55.887508] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:12.147 11:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 190479 00:34:12.147 [2024-06-10 11:44:55.921666] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.samwN1Hfmj 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:34:12.405 
11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:34:12.405 00:34:12.405 real 0m6.112s 00:34:12.405 user 0m9.370s 00:34:12.405 sys 0m1.146s 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:34:12.405 11:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:12.405 ************************************ 00:34:12.405 END TEST raid_write_error_test 00:34:12.405 ************************************ 00:34:12.405 11:44:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:34:12.405 11:44:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:34:12.405 11:44:56 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:34:12.405 11:44:56 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:34:12.405 11:44:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:34:12.405 ************************************ 00:34:12.405 START TEST raid_state_function_test 00:34:12.405 ************************************ 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 false 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local 
superblock_create_arg 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=191452 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 191452' 00:34:12.405 Process raid pid: 191452 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 191452 /var/tmp/spdk-raid.sock 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 191452 ']' 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:34:12.405 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:34:12.405 11:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:12.405 [2024-06-10 11:44:56.297831] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:34:12.405 [2024-06-10 11:44:56.297887] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:12.663 [2024-06-10 11:44:56.386377] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:12.663 [2024-06-10 11:44:56.473780] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:34:12.663 [2024-06-10 11:44:56.532758] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:12.663 [2024-06-10 11:44:56.532783] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:13.227 11:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:34:13.227 11:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:34:13.227 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:13.517 [2024-06-10 11:44:57.239616] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:34:13.517 [2024-06-10 11:44:57.239651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:34:13.517 [2024-06-10 11:44:57.239659] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:34:13.517 [2024-06-10 11:44:57.239666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:34:13.517 [2024-06-10 11:44:57.239672] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:34:13.517 [2024-06-10 11:44:57.239680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:34:13.517 [2024-06-10 11:44:57.239686] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:34:13.517 [2024-06-10 11:44:57.239694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:13.517 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:13.822 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:13.822 "name": "Existed_Raid", 00:34:13.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:13.822 "strip_size_kb": 64, 00:34:13.822 "state": "configuring", 00:34:13.822 "raid_level": "concat", 00:34:13.822 "superblock": false, 00:34:13.822 "num_base_bdevs": 4, 00:34:13.822 "num_base_bdevs_discovered": 0, 00:34:13.822 "num_base_bdevs_operational": 4, 00:34:13.822 "base_bdevs_list": [ 00:34:13.822 { 00:34:13.822 "name": "BaseBdev1", 00:34:13.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:13.822 "is_configured": false, 00:34:13.822 "data_offset": 0, 00:34:13.822 "data_size": 0 00:34:13.822 }, 00:34:13.822 { 00:34:13.822 "name": "BaseBdev2", 00:34:13.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:13.822 "is_configured": false, 00:34:13.822 "data_offset": 0, 00:34:13.822 "data_size": 0 00:34:13.822 }, 00:34:13.822 { 00:34:13.822 "name": "BaseBdev3", 00:34:13.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:13.822 "is_configured": false, 00:34:13.822 "data_offset": 0, 00:34:13.822 "data_size": 0 00:34:13.822 }, 00:34:13.822 { 00:34:13.822 "name": "BaseBdev4", 00:34:13.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:13.822 "is_configured": false, 00:34:13.822 "data_offset": 0, 00:34:13.822 "data_size": 0 00:34:13.822 } 00:34:13.822 ] 00:34:13.822 }' 00:34:13.822 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:13.822 11:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:14.079 11:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:34:14.336 [2024-06-10 11:44:58.069686] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:34:14.336 [2024-06-10 11:44:58.069710] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x265f550 name Existed_Raid, state configuring 00:34:14.336 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:14.336 [2024-06-10 11:44:58.234131] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:34:14.336 [2024-06-10 11:44:58.234152] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:34:14.336 [2024-06-10 11:44:58.234158] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:34:14.336 [2024-06-10 11:44:58.234166] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:34:14.336 [2024-06-10 11:44:58.234172] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:34:14.336 [2024-06-10 11:44:58.234179] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:34:14.336 [2024-06-10 11:44:58.234185] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:34:14.336 [2024-06-10 11:44:58.234192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:34:14.336 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:34:14.593 [2024-06-10 11:44:58.419334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:14.593 BaseBdev1 00:34:14.593 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:34:14.593 11:44:58 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:34:14.593 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:14.593 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:14.593 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:14.593 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:14.593 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:14.850 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:34:14.850 [ 00:34:14.850 { 00:34:14.850 "name": "BaseBdev1", 00:34:14.850 "aliases": [ 00:34:14.850 "41ebf850-2233-41c1-b5bb-7a8c01c6b487" 00:34:14.850 ], 00:34:14.850 "product_name": "Malloc disk", 00:34:14.850 "block_size": 512, 00:34:14.850 "num_blocks": 65536, 00:34:14.850 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:14.851 "assigned_rate_limits": { 00:34:14.851 "rw_ios_per_sec": 0, 00:34:14.851 "rw_mbytes_per_sec": 0, 00:34:14.851 "r_mbytes_per_sec": 0, 00:34:14.851 "w_mbytes_per_sec": 0 00:34:14.851 }, 00:34:14.851 "claimed": true, 00:34:14.851 "claim_type": "exclusive_write", 00:34:14.851 "zoned": false, 00:34:14.851 "supported_io_types": { 00:34:14.851 "read": true, 00:34:14.851 "write": true, 00:34:14.851 "unmap": true, 00:34:14.851 "write_zeroes": true, 00:34:14.851 "flush": true, 00:34:14.851 "reset": true, 00:34:14.851 "compare": false, 00:34:14.851 "compare_and_write": false, 00:34:14.851 "abort": true, 00:34:14.851 "nvme_admin": false, 00:34:14.851 "nvme_io": false 00:34:14.851 }, 00:34:14.851 
"memory_domains": [ 00:34:14.851 { 00:34:14.851 "dma_device_id": "system", 00:34:14.851 "dma_device_type": 1 00:34:14.851 }, 00:34:14.851 { 00:34:14.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:14.851 "dma_device_type": 2 00:34:14.851 } 00:34:14.851 ], 00:34:14.851 "driver_specific": {} 00:34:14.851 } 00:34:14.851 ] 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:14.851 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:15.109 11:44:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:15.109 "name": "Existed_Raid", 00:34:15.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:15.109 "strip_size_kb": 64, 00:34:15.109 "state": "configuring", 00:34:15.109 "raid_level": "concat", 00:34:15.109 "superblock": false, 00:34:15.109 "num_base_bdevs": 4, 00:34:15.109 "num_base_bdevs_discovered": 1, 00:34:15.109 "num_base_bdevs_operational": 4, 00:34:15.109 "base_bdevs_list": [ 00:34:15.109 { 00:34:15.109 "name": "BaseBdev1", 00:34:15.109 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:15.109 "is_configured": true, 00:34:15.109 "data_offset": 0, 00:34:15.109 "data_size": 65536 00:34:15.109 }, 00:34:15.109 { 00:34:15.109 "name": "BaseBdev2", 00:34:15.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:15.109 "is_configured": false, 00:34:15.109 "data_offset": 0, 00:34:15.109 "data_size": 0 00:34:15.109 }, 00:34:15.109 { 00:34:15.109 "name": "BaseBdev3", 00:34:15.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:15.109 "is_configured": false, 00:34:15.109 "data_offset": 0, 00:34:15.109 "data_size": 0 00:34:15.109 }, 00:34:15.109 { 00:34:15.109 "name": "BaseBdev4", 00:34:15.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:15.109 "is_configured": false, 00:34:15.109 "data_offset": 0, 00:34:15.109 "data_size": 0 00:34:15.109 } 00:34:15.109 ] 00:34:15.109 }' 00:34:15.109 11:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:15.109 11:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:15.673 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:34:15.673 [2024-06-10 11:44:59.602383] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:34:15.673 [2024-06-10 11:44:59.602413] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x265edc0 name Existed_Raid, state configuring 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:15.930 [2024-06-10 11:44:59.778862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:15.930 [2024-06-10 11:44:59.779882] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:34:15.930 [2024-06-10 11:44:59.779909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:34:15.930 [2024-06-10 11:44:59.779916] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:34:15.930 [2024-06-10 11:44:59.779924] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:34:15.930 [2024-06-10 11:44:59.779929] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:34:15.930 [2024-06-10 11:44:59.779936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:15.930 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:16.187 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:16.187 "name": "Existed_Raid", 00:34:16.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:16.188 "strip_size_kb": 64, 00:34:16.188 "state": "configuring", 00:34:16.188 "raid_level": "concat", 00:34:16.188 "superblock": false, 00:34:16.188 "num_base_bdevs": 4, 00:34:16.188 "num_base_bdevs_discovered": 1, 00:34:16.188 "num_base_bdevs_operational": 4, 00:34:16.188 "base_bdevs_list": [ 00:34:16.188 { 00:34:16.188 "name": "BaseBdev1", 00:34:16.188 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:16.188 "is_configured": true, 00:34:16.188 "data_offset": 0, 00:34:16.188 "data_size": 65536 00:34:16.188 }, 00:34:16.188 { 00:34:16.188 "name": "BaseBdev2", 00:34:16.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:16.188 "is_configured": false, 00:34:16.188 "data_offset": 0, 00:34:16.188 "data_size": 0 00:34:16.188 }, 00:34:16.188 { 00:34:16.188 "name": "BaseBdev3", 00:34:16.188 "uuid": "00000000-0000-0000-0000-000000000000", 
00:34:16.188 "is_configured": false, 00:34:16.188 "data_offset": 0, 00:34:16.188 "data_size": 0 00:34:16.188 }, 00:34:16.188 { 00:34:16.188 "name": "BaseBdev4", 00:34:16.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:16.188 "is_configured": false, 00:34:16.188 "data_offset": 0, 00:34:16.188 "data_size": 0 00:34:16.188 } 00:34:16.188 ] 00:34:16.188 }' 00:34:16.188 11:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:16.188 11:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:34:16.752 [2024-06-10 11:45:00.607965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:16.752 BaseBdev2 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:16.752 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:17.009 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:34:17.267 [ 00:34:17.267 { 00:34:17.267 "name": "BaseBdev2", 00:34:17.267 "aliases": [ 00:34:17.267 "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd" 00:34:17.267 ], 00:34:17.268 "product_name": "Malloc disk", 00:34:17.268 "block_size": 512, 00:34:17.268 "num_blocks": 65536, 00:34:17.268 "uuid": "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:17.268 "assigned_rate_limits": { 00:34:17.268 "rw_ios_per_sec": 0, 00:34:17.268 "rw_mbytes_per_sec": 0, 00:34:17.268 "r_mbytes_per_sec": 0, 00:34:17.268 "w_mbytes_per_sec": 0 00:34:17.268 }, 00:34:17.268 "claimed": true, 00:34:17.268 "claim_type": "exclusive_write", 00:34:17.268 "zoned": false, 00:34:17.268 "supported_io_types": { 00:34:17.268 "read": true, 00:34:17.268 "write": true, 00:34:17.268 "unmap": true, 00:34:17.268 "write_zeroes": true, 00:34:17.268 "flush": true, 00:34:17.268 "reset": true, 00:34:17.268 "compare": false, 00:34:17.268 "compare_and_write": false, 00:34:17.268 "abort": true, 00:34:17.268 "nvme_admin": false, 00:34:17.268 "nvme_io": false 00:34:17.268 }, 00:34:17.268 "memory_domains": [ 00:34:17.268 { 00:34:17.268 "dma_device_id": "system", 00:34:17.268 "dma_device_type": 1 00:34:17.268 }, 00:34:17.268 { 00:34:17.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:17.268 "dma_device_type": 2 00:34:17.268 } 00:34:17.268 ], 00:34:17.268 "driver_specific": {} 00:34:17.268 } 00:34:17.268 ] 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:17.268 11:45:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:17.268 11:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:17.268 11:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:17.268 "name": "Existed_Raid", 00:34:17.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:17.268 "strip_size_kb": 64, 00:34:17.268 "state": "configuring", 00:34:17.268 "raid_level": "concat", 00:34:17.268 "superblock": false, 00:34:17.268 "num_base_bdevs": 4, 00:34:17.268 "num_base_bdevs_discovered": 2, 00:34:17.268 "num_base_bdevs_operational": 4, 00:34:17.268 "base_bdevs_list": [ 00:34:17.268 { 00:34:17.268 "name": "BaseBdev1", 00:34:17.268 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:17.268 "is_configured": true, 00:34:17.268 "data_offset": 0, 00:34:17.268 "data_size": 65536 00:34:17.268 }, 00:34:17.268 { 00:34:17.268 "name": "BaseBdev2", 00:34:17.268 
"uuid": "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:17.268 "is_configured": true, 00:34:17.268 "data_offset": 0, 00:34:17.268 "data_size": 65536 00:34:17.268 }, 00:34:17.268 { 00:34:17.268 "name": "BaseBdev3", 00:34:17.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:17.268 "is_configured": false, 00:34:17.268 "data_offset": 0, 00:34:17.268 "data_size": 0 00:34:17.268 }, 00:34:17.268 { 00:34:17.268 "name": "BaseBdev4", 00:34:17.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:17.268 "is_configured": false, 00:34:17.268 "data_offset": 0, 00:34:17.268 "data_size": 0 00:34:17.268 } 00:34:17.268 ] 00:34:17.268 }' 00:34:17.268 11:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:17.268 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:17.833 11:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:34:18.090 [2024-06-10 11:45:01.826084] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:18.090 BaseBdev3 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:18.090 11:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:18.090 11:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:34:18.346 [ 00:34:18.346 { 00:34:18.346 "name": "BaseBdev3", 00:34:18.346 "aliases": [ 00:34:18.346 "4e2d6bd1-e42d-4084-afd1-62e7206ce072" 00:34:18.346 ], 00:34:18.346 "product_name": "Malloc disk", 00:34:18.346 "block_size": 512, 00:34:18.346 "num_blocks": 65536, 00:34:18.346 "uuid": "4e2d6bd1-e42d-4084-afd1-62e7206ce072", 00:34:18.346 "assigned_rate_limits": { 00:34:18.346 "rw_ios_per_sec": 0, 00:34:18.346 "rw_mbytes_per_sec": 0, 00:34:18.346 "r_mbytes_per_sec": 0, 00:34:18.346 "w_mbytes_per_sec": 0 00:34:18.346 }, 00:34:18.346 "claimed": true, 00:34:18.346 "claim_type": "exclusive_write", 00:34:18.346 "zoned": false, 00:34:18.346 "supported_io_types": { 00:34:18.346 "read": true, 00:34:18.346 "write": true, 00:34:18.346 "unmap": true, 00:34:18.346 "write_zeroes": true, 00:34:18.346 "flush": true, 00:34:18.346 "reset": true, 00:34:18.346 "compare": false, 00:34:18.346 "compare_and_write": false, 00:34:18.346 "abort": true, 00:34:18.346 "nvme_admin": false, 00:34:18.346 "nvme_io": false 00:34:18.346 }, 00:34:18.346 "memory_domains": [ 00:34:18.346 { 00:34:18.346 "dma_device_id": "system", 00:34:18.346 "dma_device_type": 1 00:34:18.346 }, 00:34:18.346 { 00:34:18.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:18.346 "dma_device_type": 2 00:34:18.346 } 00:34:18.346 ], 00:34:18.346 "driver_specific": {} 00:34:18.346 } 00:34:18.346 ] 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:18.346 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:18.347 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:18.347 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:18.347 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:18.347 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:18.347 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:18.604 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:18.604 "name": "Existed_Raid", 00:34:18.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:18.604 "strip_size_kb": 64, 00:34:18.604 "state": "configuring", 00:34:18.604 "raid_level": "concat", 00:34:18.604 "superblock": false, 00:34:18.604 "num_base_bdevs": 4, 00:34:18.604 "num_base_bdevs_discovered": 3, 00:34:18.604 "num_base_bdevs_operational": 4, 00:34:18.604 
"base_bdevs_list": [ 00:34:18.604 { 00:34:18.604 "name": "BaseBdev1", 00:34:18.604 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:18.604 "is_configured": true, 00:34:18.604 "data_offset": 0, 00:34:18.604 "data_size": 65536 00:34:18.604 }, 00:34:18.604 { 00:34:18.604 "name": "BaseBdev2", 00:34:18.604 "uuid": "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:18.604 "is_configured": true, 00:34:18.604 "data_offset": 0, 00:34:18.604 "data_size": 65536 00:34:18.604 }, 00:34:18.604 { 00:34:18.604 "name": "BaseBdev3", 00:34:18.604 "uuid": "4e2d6bd1-e42d-4084-afd1-62e7206ce072", 00:34:18.604 "is_configured": true, 00:34:18.604 "data_offset": 0, 00:34:18.604 "data_size": 65536 00:34:18.604 }, 00:34:18.604 { 00:34:18.604 "name": "BaseBdev4", 00:34:18.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:18.604 "is_configured": false, 00:34:18.604 "data_offset": 0, 00:34:18.604 "data_size": 0 00:34:18.604 } 00:34:18.604 ] 00:34:18.604 }' 00:34:18.604 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:18.604 11:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:19.168 11:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:34:19.168 [2024-06-10 11:45:03.040773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:34:19.168 [2024-06-10 11:45:03.040803] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x265fe20 00:34:19.168 [2024-06-10 11:45:03.040808] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:34:19.168 [2024-06-10 11:45:03.040987] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2660a70 00:34:19.168 [2024-06-10 11:45:03.041073] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x265fe20 00:34:19.168 
[2024-06-10 11:45:03.041080] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x265fe20 00:34:19.168 [2024-06-10 11:45:03.041212] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:19.168 BaseBdev4 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:19.168 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:19.425 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:34:19.684 [ 00:34:19.684 { 00:34:19.684 "name": "BaseBdev4", 00:34:19.684 "aliases": [ 00:34:19.684 "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a" 00:34:19.684 ], 00:34:19.684 "product_name": "Malloc disk", 00:34:19.684 "block_size": 512, 00:34:19.684 "num_blocks": 65536, 00:34:19.684 "uuid": "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a", 00:34:19.684 "assigned_rate_limits": { 00:34:19.684 "rw_ios_per_sec": 0, 00:34:19.684 "rw_mbytes_per_sec": 0, 00:34:19.684 "r_mbytes_per_sec": 0, 00:34:19.684 "w_mbytes_per_sec": 0 00:34:19.684 }, 00:34:19.684 "claimed": true, 00:34:19.684 "claim_type": "exclusive_write", 00:34:19.684 "zoned": 
false, 00:34:19.684 "supported_io_types": { 00:34:19.684 "read": true, 00:34:19.684 "write": true, 00:34:19.684 "unmap": true, 00:34:19.684 "write_zeroes": true, 00:34:19.684 "flush": true, 00:34:19.684 "reset": true, 00:34:19.684 "compare": false, 00:34:19.684 "compare_and_write": false, 00:34:19.684 "abort": true, 00:34:19.684 "nvme_admin": false, 00:34:19.684 "nvme_io": false 00:34:19.684 }, 00:34:19.684 "memory_domains": [ 00:34:19.684 { 00:34:19.684 "dma_device_id": "system", 00:34:19.684 "dma_device_type": 1 00:34:19.684 }, 00:34:19.684 { 00:34:19.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:19.684 "dma_device_type": 2 00:34:19.684 } 00:34:19.684 ], 00:34:19.684 "driver_specific": {} 00:34:19.684 } 00:34:19.684 ] 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:19.684 
11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:19.684 "name": "Existed_Raid", 00:34:19.684 "uuid": "da9e6010-0760-458e-9219-66db7dfea2ed", 00:34:19.684 "strip_size_kb": 64, 00:34:19.684 "state": "online", 00:34:19.684 "raid_level": "concat", 00:34:19.684 "superblock": false, 00:34:19.684 "num_base_bdevs": 4, 00:34:19.684 "num_base_bdevs_discovered": 4, 00:34:19.684 "num_base_bdevs_operational": 4, 00:34:19.684 "base_bdevs_list": [ 00:34:19.684 { 00:34:19.684 "name": "BaseBdev1", 00:34:19.684 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:19.684 "is_configured": true, 00:34:19.684 "data_offset": 0, 00:34:19.684 "data_size": 65536 00:34:19.684 }, 00:34:19.684 { 00:34:19.684 "name": "BaseBdev2", 00:34:19.684 "uuid": "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:19.684 "is_configured": true, 00:34:19.684 "data_offset": 0, 00:34:19.684 "data_size": 65536 00:34:19.684 }, 00:34:19.684 { 00:34:19.684 "name": "BaseBdev3", 00:34:19.684 "uuid": "4e2d6bd1-e42d-4084-afd1-62e7206ce072", 00:34:19.684 "is_configured": true, 00:34:19.684 "data_offset": 0, 00:34:19.684 "data_size": 65536 00:34:19.684 }, 00:34:19.684 { 00:34:19.684 "name": "BaseBdev4", 00:34:19.684 "uuid": "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a", 00:34:19.684 "is_configured": true, 00:34:19.684 "data_offset": 0, 00:34:19.684 "data_size": 65536 00:34:19.684 } 00:34:19.684 ] 00:34:19.684 }' 00:34:19.684 11:45:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:19.684 11:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:34:20.249 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:34:20.507 [2024-06-10 11:45:04.240084] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:20.507 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:34:20.507 "name": "Existed_Raid", 00:34:20.507 "aliases": [ 00:34:20.507 "da9e6010-0760-458e-9219-66db7dfea2ed" 00:34:20.507 ], 00:34:20.507 "product_name": "Raid Volume", 00:34:20.507 "block_size": 512, 00:34:20.507 "num_blocks": 262144, 00:34:20.507 "uuid": "da9e6010-0760-458e-9219-66db7dfea2ed", 00:34:20.507 "assigned_rate_limits": { 00:34:20.507 "rw_ios_per_sec": 0, 00:34:20.507 "rw_mbytes_per_sec": 0, 00:34:20.507 "r_mbytes_per_sec": 0, 00:34:20.507 "w_mbytes_per_sec": 0 00:34:20.507 }, 00:34:20.507 "claimed": false, 00:34:20.507 "zoned": false, 00:34:20.507 "supported_io_types": { 00:34:20.507 
"read": true, 00:34:20.507 "write": true, 00:34:20.507 "unmap": true, 00:34:20.507 "write_zeroes": true, 00:34:20.507 "flush": true, 00:34:20.507 "reset": true, 00:34:20.507 "compare": false, 00:34:20.507 "compare_and_write": false, 00:34:20.507 "abort": false, 00:34:20.507 "nvme_admin": false, 00:34:20.507 "nvme_io": false 00:34:20.507 }, 00:34:20.507 "memory_domains": [ 00:34:20.507 { 00:34:20.507 "dma_device_id": "system", 00:34:20.507 "dma_device_type": 1 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:20.507 "dma_device_type": 2 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "system", 00:34:20.507 "dma_device_type": 1 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:20.507 "dma_device_type": 2 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "system", 00:34:20.507 "dma_device_type": 1 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:20.507 "dma_device_type": 2 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "system", 00:34:20.507 "dma_device_type": 1 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:20.507 "dma_device_type": 2 00:34:20.507 } 00:34:20.507 ], 00:34:20.507 "driver_specific": { 00:34:20.507 "raid": { 00:34:20.507 "uuid": "da9e6010-0760-458e-9219-66db7dfea2ed", 00:34:20.507 "strip_size_kb": 64, 00:34:20.507 "state": "online", 00:34:20.507 "raid_level": "concat", 00:34:20.507 "superblock": false, 00:34:20.507 "num_base_bdevs": 4, 00:34:20.507 "num_base_bdevs_discovered": 4, 00:34:20.507 "num_base_bdevs_operational": 4, 00:34:20.507 "base_bdevs_list": [ 00:34:20.507 { 00:34:20.507 "name": "BaseBdev1", 00:34:20.507 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:20.507 "is_configured": true, 00:34:20.507 "data_offset": 0, 00:34:20.507 "data_size": 65536 00:34:20.507 }, 00:34:20.507 { 00:34:20.507 "name": "BaseBdev2", 00:34:20.507 "uuid": 
"a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:20.507 "is_configured": true, 00:34:20.508 "data_offset": 0, 00:34:20.508 "data_size": 65536 00:34:20.508 }, 00:34:20.508 { 00:34:20.508 "name": "BaseBdev3", 00:34:20.508 "uuid": "4e2d6bd1-e42d-4084-afd1-62e7206ce072", 00:34:20.508 "is_configured": true, 00:34:20.508 "data_offset": 0, 00:34:20.508 "data_size": 65536 00:34:20.508 }, 00:34:20.508 { 00:34:20.508 "name": "BaseBdev4", 00:34:20.508 "uuid": "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a", 00:34:20.508 "is_configured": true, 00:34:20.508 "data_offset": 0, 00:34:20.508 "data_size": 65536 00:34:20.508 } 00:34:20.508 ] 00:34:20.508 } 00:34:20.508 } 00:34:20.508 }' 00:34:20.508 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:34:20.508 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:34:20.508 BaseBdev2 00:34:20.508 BaseBdev3 00:34:20.508 BaseBdev4' 00:34:20.508 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:20.508 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:34:20.508 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:20.765 "name": "BaseBdev1", 00:34:20.765 "aliases": [ 00:34:20.765 "41ebf850-2233-41c1-b5bb-7a8c01c6b487" 00:34:20.765 ], 00:34:20.765 "product_name": "Malloc disk", 00:34:20.765 "block_size": 512, 00:34:20.765 "num_blocks": 65536, 00:34:20.765 "uuid": "41ebf850-2233-41c1-b5bb-7a8c01c6b487", 00:34:20.765 "assigned_rate_limits": { 00:34:20.765 "rw_ios_per_sec": 0, 00:34:20.765 "rw_mbytes_per_sec": 0, 00:34:20.765 "r_mbytes_per_sec": 0, 
00:34:20.765 "w_mbytes_per_sec": 0 00:34:20.765 }, 00:34:20.765 "claimed": true, 00:34:20.765 "claim_type": "exclusive_write", 00:34:20.765 "zoned": false, 00:34:20.765 "supported_io_types": { 00:34:20.765 "read": true, 00:34:20.765 "write": true, 00:34:20.765 "unmap": true, 00:34:20.765 "write_zeroes": true, 00:34:20.765 "flush": true, 00:34:20.765 "reset": true, 00:34:20.765 "compare": false, 00:34:20.765 "compare_and_write": false, 00:34:20.765 "abort": true, 00:34:20.765 "nvme_admin": false, 00:34:20.765 "nvme_io": false 00:34:20.765 }, 00:34:20.765 "memory_domains": [ 00:34:20.765 { 00:34:20.765 "dma_device_id": "system", 00:34:20.765 "dma_device_type": 1 00:34:20.765 }, 00:34:20.765 { 00:34:20.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:20.765 "dma_device_type": 2 00:34:20.765 } 00:34:20.765 ], 00:34:20.765 "driver_specific": {} 00:34:20.765 }' 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:20.765 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:34:21.023 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:21.280 11:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:21.280 "name": "BaseBdev2", 00:34:21.280 "aliases": [ 00:34:21.280 "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd" 00:34:21.280 ], 00:34:21.280 "product_name": "Malloc disk", 00:34:21.280 "block_size": 512, 00:34:21.280 "num_blocks": 65536, 00:34:21.280 "uuid": "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:21.280 "assigned_rate_limits": { 00:34:21.280 "rw_ios_per_sec": 0, 00:34:21.280 "rw_mbytes_per_sec": 0, 00:34:21.280 "r_mbytes_per_sec": 0, 00:34:21.280 "w_mbytes_per_sec": 0 00:34:21.280 }, 00:34:21.280 "claimed": true, 00:34:21.280 "claim_type": "exclusive_write", 00:34:21.280 "zoned": false, 00:34:21.280 "supported_io_types": { 00:34:21.280 "read": true, 00:34:21.280 "write": true, 00:34:21.280 "unmap": true, 00:34:21.280 "write_zeroes": true, 00:34:21.280 "flush": true, 00:34:21.280 "reset": true, 00:34:21.280 "compare": false, 00:34:21.280 "compare_and_write": false, 00:34:21.280 "abort": true, 00:34:21.280 "nvme_admin": false, 00:34:21.280 "nvme_io": false 00:34:21.280 }, 00:34:21.280 "memory_domains": [ 00:34:21.280 { 00:34:21.280 "dma_device_id": "system", 00:34:21.280 "dma_device_type": 1 00:34:21.280 }, 00:34:21.280 { 00:34:21.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:21.280 "dma_device_type": 2 00:34:21.280 } 00:34:21.280 ], 00:34:21.280 "driver_specific": {} 00:34:21.280 }' 00:34:21.280 11:45:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:21.280 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:21.538 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:21.538 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:21.538 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:34:21.538 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:21.538 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:21.538 "name": "BaseBdev3", 00:34:21.538 "aliases": [ 00:34:21.538 "4e2d6bd1-e42d-4084-afd1-62e7206ce072" 00:34:21.538 ], 00:34:21.538 "product_name": "Malloc disk", 00:34:21.538 "block_size": 512, 00:34:21.538 "num_blocks": 65536, 00:34:21.538 "uuid": 
"4e2d6bd1-e42d-4084-afd1-62e7206ce072", 00:34:21.538 "assigned_rate_limits": { 00:34:21.538 "rw_ios_per_sec": 0, 00:34:21.538 "rw_mbytes_per_sec": 0, 00:34:21.538 "r_mbytes_per_sec": 0, 00:34:21.538 "w_mbytes_per_sec": 0 00:34:21.538 }, 00:34:21.538 "claimed": true, 00:34:21.538 "claim_type": "exclusive_write", 00:34:21.538 "zoned": false, 00:34:21.538 "supported_io_types": { 00:34:21.538 "read": true, 00:34:21.538 "write": true, 00:34:21.538 "unmap": true, 00:34:21.538 "write_zeroes": true, 00:34:21.538 "flush": true, 00:34:21.538 "reset": true, 00:34:21.538 "compare": false, 00:34:21.538 "compare_and_write": false, 00:34:21.538 "abort": true, 00:34:21.538 "nvme_admin": false, 00:34:21.538 "nvme_io": false 00:34:21.538 }, 00:34:21.538 "memory_domains": [ 00:34:21.538 { 00:34:21.538 "dma_device_id": "system", 00:34:21.538 "dma_device_type": 1 00:34:21.538 }, 00:34:21.538 { 00:34:21.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:21.538 "dma_device_type": 2 00:34:21.538 } 00:34:21.538 ], 00:34:21.538 "driver_specific": {} 00:34:21.538 }' 00:34:21.538 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:21.795 
11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:21.795 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:22.053 "name": "BaseBdev4", 00:34:22.053 "aliases": [ 00:34:22.053 "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a" 00:34:22.053 ], 00:34:22.053 "product_name": "Malloc disk", 00:34:22.053 "block_size": 512, 00:34:22.053 "num_blocks": 65536, 00:34:22.053 "uuid": "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a", 00:34:22.053 "assigned_rate_limits": { 00:34:22.053 "rw_ios_per_sec": 0, 00:34:22.053 "rw_mbytes_per_sec": 0, 00:34:22.053 "r_mbytes_per_sec": 0, 00:34:22.053 "w_mbytes_per_sec": 0 00:34:22.053 }, 00:34:22.053 "claimed": true, 00:34:22.053 "claim_type": "exclusive_write", 00:34:22.053 "zoned": false, 00:34:22.053 "supported_io_types": { 00:34:22.053 "read": true, 00:34:22.053 "write": true, 00:34:22.053 "unmap": true, 00:34:22.053 "write_zeroes": true, 00:34:22.053 "flush": true, 00:34:22.053 "reset": true, 00:34:22.053 "compare": false, 00:34:22.053 "compare_and_write": false, 00:34:22.053 "abort": true, 00:34:22.053 "nvme_admin": false, 00:34:22.053 "nvme_io": false 00:34:22.053 }, 00:34:22.053 "memory_domains": [ 00:34:22.053 { 00:34:22.053 "dma_device_id": "system", 00:34:22.053 "dma_device_type": 1 00:34:22.053 }, 00:34:22.053 { 00:34:22.053 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:34:22.053 "dma_device_type": 2 00:34:22.053 } 00:34:22.053 ], 00:34:22.053 "driver_specific": {} 00:34:22.053 }' 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:22.053 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:22.311 11:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:22.311 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:34:22.572 [2024-06-10 11:45:06.353412] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:22.572 [2024-06-10 11:45:06.353434] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:22.572 [2024-06-10 11:45:06.353468] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@275 -- # local expected_state 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:22.572 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:22.830 11:45:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:22.830 "name": "Existed_Raid", 00:34:22.830 "uuid": "da9e6010-0760-458e-9219-66db7dfea2ed", 00:34:22.830 "strip_size_kb": 64, 00:34:22.830 "state": "offline", 00:34:22.830 "raid_level": "concat", 00:34:22.830 "superblock": false, 00:34:22.830 "num_base_bdevs": 4, 00:34:22.830 "num_base_bdevs_discovered": 3, 00:34:22.830 "num_base_bdevs_operational": 3, 00:34:22.830 "base_bdevs_list": [ 00:34:22.830 { 00:34:22.830 "name": null, 00:34:22.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:22.830 "is_configured": false, 00:34:22.830 "data_offset": 0, 00:34:22.830 "data_size": 65536 00:34:22.830 }, 00:34:22.830 { 00:34:22.830 "name": "BaseBdev2", 00:34:22.830 "uuid": "a481ea3a-7f64-4d79-9cf2-2408fb7b68cd", 00:34:22.830 "is_configured": true, 00:34:22.830 "data_offset": 0, 00:34:22.830 "data_size": 65536 00:34:22.830 }, 00:34:22.830 { 00:34:22.830 "name": "BaseBdev3", 00:34:22.830 "uuid": "4e2d6bd1-e42d-4084-afd1-62e7206ce072", 00:34:22.830 "is_configured": true, 00:34:22.830 "data_offset": 0, 00:34:22.830 "data_size": 65536 00:34:22.830 }, 00:34:22.830 { 00:34:22.830 "name": "BaseBdev4", 00:34:22.830 "uuid": "ac5ed165-cc09-4f9d-8e3f-b4b2c1c30b9a", 00:34:22.830 "is_configured": true, 00:34:22.830 "data_offset": 0, 00:34:22.830 "data_size": 65536 00:34:22.830 } 00:34:22.830 ] 00:34:22.830 }' 00:34:22.830 11:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:22.830 11:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:23.088 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:34:23.088 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:23.088 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:34:23.088 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:34:23.346 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:34:23.346 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:34:23.346 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:34:23.604 [2024-06-10 11:45:07.348763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:34:23.604 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:34:23.604 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:23.604 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:23.604 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:34:23.862 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:34:23.862 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:34:23.862 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:34:23.862 [2024-06-10 11:45:07.709555] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:34:23.862 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:34:23.862 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:23.862 11:45:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:23.862 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:34:24.119 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:34:24.119 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:34:24.120 11:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:34:24.120 [2024-06-10 11:45:08.050279] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:34:24.120 [2024-06-10 11:45:08.050309] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x265fe20 name Existed_Raid, state offline 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 
-- # (( i < num_base_bdevs )) 00:34:24.377 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:34:24.635 BaseBdev2 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:24.635 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:34:24.892 [ 00:34:24.892 { 00:34:24.892 "name": "BaseBdev2", 00:34:24.892 "aliases": [ 00:34:24.892 "ad0124b0-1e73-43a2-9a11-12eaba34610b" 00:34:24.892 ], 00:34:24.892 "product_name": "Malloc disk", 00:34:24.892 "block_size": 512, 00:34:24.892 "num_blocks": 65536, 00:34:24.892 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:24.892 "assigned_rate_limits": { 00:34:24.892 "rw_ios_per_sec": 0, 00:34:24.892 "rw_mbytes_per_sec": 0, 00:34:24.892 "r_mbytes_per_sec": 0, 00:34:24.892 "w_mbytes_per_sec": 0 00:34:24.892 }, 00:34:24.892 "claimed": false, 00:34:24.892 "zoned": false, 00:34:24.892 "supported_io_types": { 
00:34:24.892 "read": true, 00:34:24.892 "write": true, 00:34:24.892 "unmap": true, 00:34:24.892 "write_zeroes": true, 00:34:24.892 "flush": true, 00:34:24.892 "reset": true, 00:34:24.892 "compare": false, 00:34:24.892 "compare_and_write": false, 00:34:24.892 "abort": true, 00:34:24.892 "nvme_admin": false, 00:34:24.892 "nvme_io": false 00:34:24.892 }, 00:34:24.892 "memory_domains": [ 00:34:24.892 { 00:34:24.892 "dma_device_id": "system", 00:34:24.892 "dma_device_type": 1 00:34:24.892 }, 00:34:24.892 { 00:34:24.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:24.892 "dma_device_type": 2 00:34:24.892 } 00:34:24.892 ], 00:34:24.892 "driver_specific": {} 00:34:24.892 } 00:34:24.892 ] 00:34:24.892 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:24.892 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:34:24.892 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:34:24.892 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:34:25.151 BaseBdev3 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:25.151 11:45:08 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:25.151 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:34:25.409 [ 00:34:25.409 { 00:34:25.409 "name": "BaseBdev3", 00:34:25.409 "aliases": [ 00:34:25.409 "81b74ff7-084b-4f2e-a93e-ad33dadaee20" 00:34:25.409 ], 00:34:25.409 "product_name": "Malloc disk", 00:34:25.409 "block_size": 512, 00:34:25.409 "num_blocks": 65536, 00:34:25.409 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:25.409 "assigned_rate_limits": { 00:34:25.409 "rw_ios_per_sec": 0, 00:34:25.409 "rw_mbytes_per_sec": 0, 00:34:25.409 "r_mbytes_per_sec": 0, 00:34:25.409 "w_mbytes_per_sec": 0 00:34:25.409 }, 00:34:25.409 "claimed": false, 00:34:25.409 "zoned": false, 00:34:25.409 "supported_io_types": { 00:34:25.409 "read": true, 00:34:25.409 "write": true, 00:34:25.409 "unmap": true, 00:34:25.409 "write_zeroes": true, 00:34:25.409 "flush": true, 00:34:25.409 "reset": true, 00:34:25.409 "compare": false, 00:34:25.409 "compare_and_write": false, 00:34:25.409 "abort": true, 00:34:25.409 "nvme_admin": false, 00:34:25.409 "nvme_io": false 00:34:25.409 }, 00:34:25.409 "memory_domains": [ 00:34:25.409 { 00:34:25.409 "dma_device_id": "system", 00:34:25.409 "dma_device_type": 1 00:34:25.409 }, 00:34:25.409 { 00:34:25.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:25.409 "dma_device_type": 2 00:34:25.409 } 00:34:25.409 ], 00:34:25.409 "driver_specific": {} 00:34:25.409 } 00:34:25.409 ] 00:34:25.409 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:25.409 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:34:25.409 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:34:25.409 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:34:25.667 BaseBdev4 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:25.667 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:34:25.924 [ 00:34:25.925 { 00:34:25.925 "name": "BaseBdev4", 00:34:25.925 "aliases": [ 00:34:25.925 "60c4b965-e6a9-40f7-9887-f559ed72500e" 00:34:25.925 ], 00:34:25.925 "product_name": "Malloc disk", 00:34:25.925 "block_size": 512, 00:34:25.925 "num_blocks": 65536, 00:34:25.925 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:25.925 "assigned_rate_limits": { 00:34:25.925 "rw_ios_per_sec": 0, 00:34:25.925 "rw_mbytes_per_sec": 0, 00:34:25.925 "r_mbytes_per_sec": 0, 00:34:25.925 "w_mbytes_per_sec": 0 00:34:25.925 }, 00:34:25.925 "claimed": false, 00:34:25.925 "zoned": false, 00:34:25.925 "supported_io_types": { 00:34:25.925 "read": true, 00:34:25.925 
"write": true, 00:34:25.925 "unmap": true, 00:34:25.925 "write_zeroes": true, 00:34:25.925 "flush": true, 00:34:25.925 "reset": true, 00:34:25.925 "compare": false, 00:34:25.925 "compare_and_write": false, 00:34:25.925 "abort": true, 00:34:25.925 "nvme_admin": false, 00:34:25.925 "nvme_io": false 00:34:25.925 }, 00:34:25.925 "memory_domains": [ 00:34:25.925 { 00:34:25.925 "dma_device_id": "system", 00:34:25.925 "dma_device_type": 1 00:34:25.925 }, 00:34:25.925 { 00:34:25.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:25.925 "dma_device_type": 2 00:34:25.925 } 00:34:25.925 ], 00:34:25.925 "driver_specific": {} 00:34:25.925 } 00:34:25.925 ] 00:34:25.925 11:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:25.925 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:34:25.925 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:34:25.925 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:26.183 [2024-06-10 11:45:09.882290] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:34:26.183 [2024-06-10 11:45:09.882323] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:34:26.183 [2024-06-10 11:45:09.882336] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:26.183 [2024-06-10 11:45:09.883268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:26.183 [2024-06-10 11:45:09.883298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring concat 64 4 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:26.183 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:26.184 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:26.184 11:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:26.184 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:26.184 "name": "Existed_Raid", 00:34:26.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:26.184 "strip_size_kb": 64, 00:34:26.184 "state": "configuring", 00:34:26.184 "raid_level": "concat", 00:34:26.184 "superblock": false, 00:34:26.184 "num_base_bdevs": 4, 00:34:26.184 "num_base_bdevs_discovered": 3, 00:34:26.184 "num_base_bdevs_operational": 4, 00:34:26.184 "base_bdevs_list": [ 00:34:26.184 { 00:34:26.184 "name": "BaseBdev1", 00:34:26.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:26.184 
"is_configured": false, 00:34:26.184 "data_offset": 0, 00:34:26.184 "data_size": 0 00:34:26.184 }, 00:34:26.184 { 00:34:26.184 "name": "BaseBdev2", 00:34:26.184 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:26.184 "is_configured": true, 00:34:26.184 "data_offset": 0, 00:34:26.184 "data_size": 65536 00:34:26.184 }, 00:34:26.184 { 00:34:26.184 "name": "BaseBdev3", 00:34:26.184 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:26.184 "is_configured": true, 00:34:26.184 "data_offset": 0, 00:34:26.184 "data_size": 65536 00:34:26.184 }, 00:34:26.184 { 00:34:26.184 "name": "BaseBdev4", 00:34:26.184 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:26.184 "is_configured": true, 00:34:26.184 "data_offset": 0, 00:34:26.184 "data_size": 65536 00:34:26.184 } 00:34:26.184 ] 00:34:26.184 }' 00:34:26.184 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:26.184 11:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:26.749 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:34:27.008 [2024-06-10 11:45:10.708429] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:27.008 "name": "Existed_Raid", 00:34:27.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:27.008 "strip_size_kb": 64, 00:34:27.008 "state": "configuring", 00:34:27.008 "raid_level": "concat", 00:34:27.008 "superblock": false, 00:34:27.008 "num_base_bdevs": 4, 00:34:27.008 "num_base_bdevs_discovered": 2, 00:34:27.008 "num_base_bdevs_operational": 4, 00:34:27.008 "base_bdevs_list": [ 00:34:27.008 { 00:34:27.008 "name": "BaseBdev1", 00:34:27.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:27.008 "is_configured": false, 00:34:27.008 "data_offset": 0, 00:34:27.008 "data_size": 0 00:34:27.008 }, 00:34:27.008 { 00:34:27.008 "name": null, 00:34:27.008 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:27.008 "is_configured": false, 00:34:27.008 "data_offset": 0, 00:34:27.008 "data_size": 65536 00:34:27.008 }, 00:34:27.008 { 00:34:27.008 "name": "BaseBdev3", 00:34:27.008 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:27.008 "is_configured": true, 00:34:27.008 "data_offset": 0, 00:34:27.008 "data_size": 65536 00:34:27.008 }, 
00:34:27.008 { 00:34:27.008 "name": "BaseBdev4", 00:34:27.008 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:27.008 "is_configured": true, 00:34:27.008 "data_offset": 0, 00:34:27.008 "data_size": 65536 00:34:27.008 } 00:34:27.008 ] 00:34:27.008 }' 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:27.008 11:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:27.574 11:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:27.574 11:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:34:27.832 [2024-06-10 11:45:11.701742] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:27.832 BaseBdev1 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:27.832 11:45:11 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:28.089 11:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:34:28.349 [ 00:34:28.349 { 00:34:28.349 "name": "BaseBdev1", 00:34:28.349 "aliases": [ 00:34:28.349 "e1878df5-c862-481f-95c7-c00a0afe87cc" 00:34:28.349 ], 00:34:28.349 "product_name": "Malloc disk", 00:34:28.349 "block_size": 512, 00:34:28.349 "num_blocks": 65536, 00:34:28.349 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:28.349 "assigned_rate_limits": { 00:34:28.349 "rw_ios_per_sec": 0, 00:34:28.349 "rw_mbytes_per_sec": 0, 00:34:28.349 "r_mbytes_per_sec": 0, 00:34:28.349 "w_mbytes_per_sec": 0 00:34:28.349 }, 00:34:28.349 "claimed": true, 00:34:28.349 "claim_type": "exclusive_write", 00:34:28.349 "zoned": false, 00:34:28.349 "supported_io_types": { 00:34:28.349 "read": true, 00:34:28.349 "write": true, 00:34:28.349 "unmap": true, 00:34:28.349 "write_zeroes": true, 00:34:28.349 "flush": true, 00:34:28.349 "reset": true, 00:34:28.350 "compare": false, 00:34:28.350 "compare_and_write": false, 00:34:28.350 "abort": true, 00:34:28.350 "nvme_admin": false, 00:34:28.350 "nvme_io": false 00:34:28.350 }, 00:34:28.350 "memory_domains": [ 00:34:28.350 { 00:34:28.350 "dma_device_id": "system", 00:34:28.350 "dma_device_type": 1 00:34:28.350 }, 00:34:28.350 { 00:34:28.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:28.350 "dma_device_type": 2 00:34:28.350 } 00:34:28.350 ], 00:34:28.350 "driver_specific": {} 00:34:28.350 } 00:34:28.350 ] 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:28.350 
11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:28.350 "name": "Existed_Raid", 00:34:28.350 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:28.350 "strip_size_kb": 64, 00:34:28.350 "state": "configuring", 00:34:28.350 "raid_level": "concat", 00:34:28.350 "superblock": false, 00:34:28.350 "num_base_bdevs": 4, 00:34:28.350 "num_base_bdevs_discovered": 3, 00:34:28.350 "num_base_bdevs_operational": 4, 00:34:28.350 "base_bdevs_list": [ 00:34:28.350 { 00:34:28.350 "name": "BaseBdev1", 00:34:28.350 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:28.350 "is_configured": true, 00:34:28.350 
"data_offset": 0, 00:34:28.350 "data_size": 65536 00:34:28.350 }, 00:34:28.350 { 00:34:28.350 "name": null, 00:34:28.350 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:28.350 "is_configured": false, 00:34:28.350 "data_offset": 0, 00:34:28.350 "data_size": 65536 00:34:28.350 }, 00:34:28.350 { 00:34:28.350 "name": "BaseBdev3", 00:34:28.350 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:28.350 "is_configured": true, 00:34:28.350 "data_offset": 0, 00:34:28.350 "data_size": 65536 00:34:28.350 }, 00:34:28.350 { 00:34:28.350 "name": "BaseBdev4", 00:34:28.350 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:28.350 "is_configured": true, 00:34:28.350 "data_offset": 0, 00:34:28.350 "data_size": 65536 00:34:28.350 } 00:34:28.350 ] 00:34:28.350 }' 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:28.350 11:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:28.916 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:28.916 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:34:29.173 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:34:29.173 11:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:34:29.173 [2024-06-10 11:45:13.021186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:34:29.173 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:29.173 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:29.174 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:29.431 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:29.431 "name": "Existed_Raid", 00:34:29.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:29.431 "strip_size_kb": 64, 00:34:29.431 "state": "configuring", 00:34:29.431 "raid_level": "concat", 00:34:29.431 "superblock": false, 00:34:29.431 "num_base_bdevs": 4, 00:34:29.431 "num_base_bdevs_discovered": 2, 00:34:29.431 "num_base_bdevs_operational": 4, 00:34:29.431 "base_bdevs_list": [ 00:34:29.431 { 00:34:29.431 "name": "BaseBdev1", 00:34:29.431 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:29.431 "is_configured": true, 00:34:29.431 "data_offset": 0, 00:34:29.431 "data_size": 65536 00:34:29.431 }, 00:34:29.431 { 00:34:29.431 "name": null, 
00:34:29.432 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:29.432 "is_configured": false, 00:34:29.432 "data_offset": 0, 00:34:29.432 "data_size": 65536 00:34:29.432 }, 00:34:29.432 { 00:34:29.432 "name": null, 00:34:29.432 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:29.432 "is_configured": false, 00:34:29.432 "data_offset": 0, 00:34:29.432 "data_size": 65536 00:34:29.432 }, 00:34:29.432 { 00:34:29.432 "name": "BaseBdev4", 00:34:29.432 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:29.432 "is_configured": true, 00:34:29.432 "data_offset": 0, 00:34:29.432 "data_size": 65536 00:34:29.432 } 00:34:29.432 ] 00:34:29.432 }' 00:34:29.432 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:29.432 11:45:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:29.997 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:29.997 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:34:29.997 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:34:29.997 11:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:34:30.255 [2024-06-10 11:45:14.023778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- 
# local expected_state=configuring 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:30.255 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:30.513 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:30.513 "name": "Existed_Raid", 00:34:30.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:30.513 "strip_size_kb": 64, 00:34:30.513 "state": "configuring", 00:34:30.513 "raid_level": "concat", 00:34:30.513 "superblock": false, 00:34:30.513 "num_base_bdevs": 4, 00:34:30.513 "num_base_bdevs_discovered": 3, 00:34:30.513 "num_base_bdevs_operational": 4, 00:34:30.513 "base_bdevs_list": [ 00:34:30.513 { 00:34:30.513 "name": "BaseBdev1", 00:34:30.513 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:30.513 "is_configured": true, 00:34:30.513 "data_offset": 0, 00:34:30.513 "data_size": 65536 00:34:30.513 }, 00:34:30.513 { 00:34:30.513 "name": null, 00:34:30.514 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:30.514 
"is_configured": false, 00:34:30.514 "data_offset": 0, 00:34:30.514 "data_size": 65536 00:34:30.514 }, 00:34:30.514 { 00:34:30.514 "name": "BaseBdev3", 00:34:30.514 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:30.514 "is_configured": true, 00:34:30.514 "data_offset": 0, 00:34:30.514 "data_size": 65536 00:34:30.514 }, 00:34:30.514 { 00:34:30.514 "name": "BaseBdev4", 00:34:30.514 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:30.514 "is_configured": true, 00:34:30.514 "data_offset": 0, 00:34:30.514 "data_size": 65536 00:34:30.514 } 00:34:30.514 ] 00:34:30.514 }' 00:34:30.514 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:30.514 11:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:30.772 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:30.772 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:34:31.029 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:34:31.029 11:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:34:31.029 [2024-06-10 11:45:14.974277] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:31.287 "name": "Existed_Raid", 00:34:31.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:31.287 "strip_size_kb": 64, 00:34:31.287 "state": "configuring", 00:34:31.287 "raid_level": "concat", 00:34:31.287 "superblock": false, 00:34:31.287 "num_base_bdevs": 4, 00:34:31.287 "num_base_bdevs_discovered": 2, 00:34:31.287 "num_base_bdevs_operational": 4, 00:34:31.287 "base_bdevs_list": [ 00:34:31.287 { 00:34:31.287 "name": null, 00:34:31.287 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:31.287 "is_configured": false, 00:34:31.287 "data_offset": 0, 00:34:31.287 "data_size": 65536 00:34:31.287 }, 00:34:31.287 { 00:34:31.287 "name": null, 00:34:31.287 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:31.287 "is_configured": false, 00:34:31.287 "data_offset": 0, 00:34:31.287 "data_size": 65536 00:34:31.287 }, 
00:34:31.287 { 00:34:31.287 "name": "BaseBdev3", 00:34:31.287 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:31.287 "is_configured": true, 00:34:31.287 "data_offset": 0, 00:34:31.287 "data_size": 65536 00:34:31.287 }, 00:34:31.287 { 00:34:31.287 "name": "BaseBdev4", 00:34:31.287 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:31.287 "is_configured": true, 00:34:31.287 "data_offset": 0, 00:34:31.287 "data_size": 65536 00:34:31.287 } 00:34:31.287 ] 00:34:31.287 }' 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:31.287 11:45:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:31.852 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:31.852 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:34:32.111 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:34:32.111 11:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:34:32.111 [2024-06-10 11:45:16.004894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:32.111 11:45:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:32.111 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:32.369 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:32.369 "name": "Existed_Raid", 00:34:32.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:32.369 "strip_size_kb": 64, 00:34:32.369 "state": "configuring", 00:34:32.369 "raid_level": "concat", 00:34:32.369 "superblock": false, 00:34:32.369 "num_base_bdevs": 4, 00:34:32.369 "num_base_bdevs_discovered": 3, 00:34:32.369 "num_base_bdevs_operational": 4, 00:34:32.369 "base_bdevs_list": [ 00:34:32.369 { 00:34:32.369 "name": null, 00:34:32.369 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:32.369 "is_configured": false, 00:34:32.369 "data_offset": 0, 00:34:32.369 "data_size": 65536 00:34:32.369 }, 00:34:32.369 { 00:34:32.369 "name": "BaseBdev2", 00:34:32.369 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:32.369 "is_configured": true, 00:34:32.369 "data_offset": 0, 00:34:32.369 "data_size": 65536 00:34:32.369 }, 00:34:32.369 { 00:34:32.369 "name": "BaseBdev3", 00:34:32.369 "uuid": 
"81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:32.369 "is_configured": true, 00:34:32.369 "data_offset": 0, 00:34:32.369 "data_size": 65536 00:34:32.369 }, 00:34:32.369 { 00:34:32.369 "name": "BaseBdev4", 00:34:32.369 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:32.369 "is_configured": true, 00:34:32.369 "data_offset": 0, 00:34:32.369 "data_size": 65536 00:34:32.369 } 00:34:32.369 ] 00:34:32.369 }' 00:34:32.369 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:32.369 11:45:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:32.934 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:34:32.934 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:33.191 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:34:33.191 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:33.191 11:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:34:33.191 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e1878df5-c862-481f-95c7-c00a0afe87cc 00:34:33.449 [2024-06-10 11:45:17.252126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:34:33.449 [2024-06-10 11:45:17.252159] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2664050 00:34:33.449 [2024-06-10 11:45:17.252165] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, 
blocklen 512 00:34:33.449 [2024-06-10 11:45:17.252296] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26677d0 00:34:33.449 [2024-06-10 11:45:17.252375] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2664050 00:34:33.449 [2024-06-10 11:45:17.252381] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2664050 00:34:33.449 [2024-06-10 11:45:17.252510] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:33.449 NewBaseBdev 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:33.449 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:33.707 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:34:33.707 [ 00:34:33.707 { 00:34:33.707 "name": "NewBaseBdev", 00:34:33.707 "aliases": [ 00:34:33.707 "e1878df5-c862-481f-95c7-c00a0afe87cc" 00:34:33.707 ], 00:34:33.707 "product_name": "Malloc disk", 00:34:33.707 "block_size": 512, 00:34:33.707 "num_blocks": 65536, 00:34:33.707 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:33.707 
"assigned_rate_limits": { 00:34:33.707 "rw_ios_per_sec": 0, 00:34:33.707 "rw_mbytes_per_sec": 0, 00:34:33.707 "r_mbytes_per_sec": 0, 00:34:33.707 "w_mbytes_per_sec": 0 00:34:33.707 }, 00:34:33.707 "claimed": true, 00:34:33.707 "claim_type": "exclusive_write", 00:34:33.707 "zoned": false, 00:34:33.707 "supported_io_types": { 00:34:33.707 "read": true, 00:34:33.707 "write": true, 00:34:33.707 "unmap": true, 00:34:33.707 "write_zeroes": true, 00:34:33.707 "flush": true, 00:34:33.707 "reset": true, 00:34:33.707 "compare": false, 00:34:33.707 "compare_and_write": false, 00:34:33.707 "abort": true, 00:34:33.707 "nvme_admin": false, 00:34:33.707 "nvme_io": false 00:34:33.707 }, 00:34:33.707 "memory_domains": [ 00:34:33.707 { 00:34:33.707 "dma_device_id": "system", 00:34:33.707 "dma_device_type": 1 00:34:33.707 }, 00:34:33.707 { 00:34:33.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:33.707 "dma_device_type": 2 00:34:33.707 } 00:34:33.707 ], 00:34:33.707 "driver_specific": {} 00:34:33.707 } 00:34:33.707 ] 00:34:33.707 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:33.708 11:45:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:33.708 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:33.966 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:33.966 "name": "Existed_Raid", 00:34:33.966 "uuid": "7cfeef91-da8a-41ab-8539-bd5cc30690fe", 00:34:33.966 "strip_size_kb": 64, 00:34:33.966 "state": "online", 00:34:33.966 "raid_level": "concat", 00:34:33.966 "superblock": false, 00:34:33.966 "num_base_bdevs": 4, 00:34:33.966 "num_base_bdevs_discovered": 4, 00:34:33.966 "num_base_bdevs_operational": 4, 00:34:33.966 "base_bdevs_list": [ 00:34:33.966 { 00:34:33.966 "name": "NewBaseBdev", 00:34:33.966 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:33.966 "is_configured": true, 00:34:33.966 "data_offset": 0, 00:34:33.966 "data_size": 65536 00:34:33.966 }, 00:34:33.966 { 00:34:33.966 "name": "BaseBdev2", 00:34:33.966 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:33.966 "is_configured": true, 00:34:33.966 "data_offset": 0, 00:34:33.966 "data_size": 65536 00:34:33.966 }, 00:34:33.966 { 00:34:33.966 "name": "BaseBdev3", 00:34:33.966 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:33.966 "is_configured": true, 00:34:33.966 "data_offset": 0, 00:34:33.966 "data_size": 65536 00:34:33.966 }, 00:34:33.966 { 00:34:33.966 "name": "BaseBdev4", 00:34:33.966 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:33.966 "is_configured": true, 00:34:33.966 "data_offset": 0, 
00:34:33.966 "data_size": 65536 00:34:33.966 } 00:34:33.966 ] 00:34:33.966 }' 00:34:33.966 11:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:33.966 11:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:34:34.532 [2024-06-10 11:45:18.415347] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:34:34.532 "name": "Existed_Raid", 00:34:34.532 "aliases": [ 00:34:34.532 "7cfeef91-da8a-41ab-8539-bd5cc30690fe" 00:34:34.532 ], 00:34:34.532 "product_name": "Raid Volume", 00:34:34.532 "block_size": 512, 00:34:34.532 "num_blocks": 262144, 00:34:34.532 "uuid": "7cfeef91-da8a-41ab-8539-bd5cc30690fe", 00:34:34.532 "assigned_rate_limits": { 00:34:34.532 "rw_ios_per_sec": 0, 00:34:34.532 "rw_mbytes_per_sec": 0, 00:34:34.532 "r_mbytes_per_sec": 0, 00:34:34.532 "w_mbytes_per_sec": 0 00:34:34.532 }, 00:34:34.532 
"claimed": false, 00:34:34.532 "zoned": false, 00:34:34.532 "supported_io_types": { 00:34:34.532 "read": true, 00:34:34.532 "write": true, 00:34:34.532 "unmap": true, 00:34:34.532 "write_zeroes": true, 00:34:34.532 "flush": true, 00:34:34.532 "reset": true, 00:34:34.532 "compare": false, 00:34:34.532 "compare_and_write": false, 00:34:34.532 "abort": false, 00:34:34.532 "nvme_admin": false, 00:34:34.532 "nvme_io": false 00:34:34.532 }, 00:34:34.532 "memory_domains": [ 00:34:34.532 { 00:34:34.532 "dma_device_id": "system", 00:34:34.532 "dma_device_type": 1 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:34.532 "dma_device_type": 2 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "system", 00:34:34.532 "dma_device_type": 1 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:34.532 "dma_device_type": 2 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "system", 00:34:34.532 "dma_device_type": 1 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:34.532 "dma_device_type": 2 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "system", 00:34:34.532 "dma_device_type": 1 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:34.532 "dma_device_type": 2 00:34:34.532 } 00:34:34.532 ], 00:34:34.532 "driver_specific": { 00:34:34.532 "raid": { 00:34:34.532 "uuid": "7cfeef91-da8a-41ab-8539-bd5cc30690fe", 00:34:34.532 "strip_size_kb": 64, 00:34:34.532 "state": "online", 00:34:34.532 "raid_level": "concat", 00:34:34.532 "superblock": false, 00:34:34.532 "num_base_bdevs": 4, 00:34:34.532 "num_base_bdevs_discovered": 4, 00:34:34.532 "num_base_bdevs_operational": 4, 00:34:34.532 "base_bdevs_list": [ 00:34:34.532 { 00:34:34.532 "name": "NewBaseBdev", 00:34:34.532 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:34.532 "is_configured": true, 00:34:34.532 "data_offset": 0, 00:34:34.532 
"data_size": 65536 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "name": "BaseBdev2", 00:34:34.532 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:34.532 "is_configured": true, 00:34:34.532 "data_offset": 0, 00:34:34.532 "data_size": 65536 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "name": "BaseBdev3", 00:34:34.532 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:34.532 "is_configured": true, 00:34:34.532 "data_offset": 0, 00:34:34.532 "data_size": 65536 00:34:34.532 }, 00:34:34.532 { 00:34:34.532 "name": "BaseBdev4", 00:34:34.532 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:34.532 "is_configured": true, 00:34:34.532 "data_offset": 0, 00:34:34.532 "data_size": 65536 00:34:34.532 } 00:34:34.532 ] 00:34:34.532 } 00:34:34.532 } 00:34:34.532 }' 00:34:34.532 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:34:34.790 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:34:34.790 BaseBdev2 00:34:34.790 BaseBdev3 00:34:34.790 BaseBdev4' 00:34:34.790 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:34.790 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:34:34.790 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:34.790 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:34.790 "name": "NewBaseBdev", 00:34:34.790 "aliases": [ 00:34:34.790 "e1878df5-c862-481f-95c7-c00a0afe87cc" 00:34:34.790 ], 00:34:34.790 "product_name": "Malloc disk", 00:34:34.790 "block_size": 512, 00:34:34.790 "num_blocks": 65536, 00:34:34.790 "uuid": "e1878df5-c862-481f-95c7-c00a0afe87cc", 00:34:34.790 "assigned_rate_limits": { 
00:34:34.790 "rw_ios_per_sec": 0, 00:34:34.790 "rw_mbytes_per_sec": 0, 00:34:34.790 "r_mbytes_per_sec": 0, 00:34:34.790 "w_mbytes_per_sec": 0 00:34:34.790 }, 00:34:34.791 "claimed": true, 00:34:34.791 "claim_type": "exclusive_write", 00:34:34.791 "zoned": false, 00:34:34.791 "supported_io_types": { 00:34:34.791 "read": true, 00:34:34.791 "write": true, 00:34:34.791 "unmap": true, 00:34:34.791 "write_zeroes": true, 00:34:34.791 "flush": true, 00:34:34.791 "reset": true, 00:34:34.791 "compare": false, 00:34:34.791 "compare_and_write": false, 00:34:34.791 "abort": true, 00:34:34.791 "nvme_admin": false, 00:34:34.791 "nvme_io": false 00:34:34.791 }, 00:34:34.791 "memory_domains": [ 00:34:34.791 { 00:34:34.791 "dma_device_id": "system", 00:34:34.791 "dma_device_type": 1 00:34:34.791 }, 00:34:34.791 { 00:34:34.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:34.791 "dma_device_type": 2 00:34:34.791 } 00:34:34.791 ], 00:34:34.791 "driver_specific": {} 00:34:34.791 }' 00:34:34.791 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:34.791 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:34.791 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:34.791 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:34:35.049 11:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:35.307 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:35.307 "name": "BaseBdev2", 00:34:35.307 "aliases": [ 00:34:35.307 "ad0124b0-1e73-43a2-9a11-12eaba34610b" 00:34:35.307 ], 00:34:35.307 "product_name": "Malloc disk", 00:34:35.307 "block_size": 512, 00:34:35.307 "num_blocks": 65536, 00:34:35.307 "uuid": "ad0124b0-1e73-43a2-9a11-12eaba34610b", 00:34:35.307 "assigned_rate_limits": { 00:34:35.307 "rw_ios_per_sec": 0, 00:34:35.307 "rw_mbytes_per_sec": 0, 00:34:35.307 "r_mbytes_per_sec": 0, 00:34:35.307 "w_mbytes_per_sec": 0 00:34:35.307 }, 00:34:35.307 "claimed": true, 00:34:35.307 "claim_type": "exclusive_write", 00:34:35.307 "zoned": false, 00:34:35.307 "supported_io_types": { 00:34:35.307 "read": true, 00:34:35.307 "write": true, 00:34:35.307 "unmap": true, 00:34:35.307 "write_zeroes": true, 00:34:35.307 "flush": true, 00:34:35.307 "reset": true, 00:34:35.307 "compare": false, 00:34:35.307 "compare_and_write": false, 00:34:35.307 "abort": true, 00:34:35.307 "nvme_admin": false, 00:34:35.307 "nvme_io": false 00:34:35.307 }, 00:34:35.307 "memory_domains": [ 00:34:35.307 { 00:34:35.307 "dma_device_id": "system", 00:34:35.307 "dma_device_type": 1 00:34:35.307 }, 00:34:35.307 { 00:34:35.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:35.307 "dma_device_type": 2 00:34:35.307 } 00:34:35.307 
], 00:34:35.307 "driver_specific": {} 00:34:35.307 }' 00:34:35.307 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:35.307 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:35.307 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:35.307 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:35.307 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:34:35.569 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:35.827 "name": "BaseBdev3", 00:34:35.827 "aliases": [ 00:34:35.827 "81b74ff7-084b-4f2e-a93e-ad33dadaee20" 00:34:35.827 ], 00:34:35.827 "product_name": "Malloc disk", 00:34:35.827 
"block_size": 512, 00:34:35.827 "num_blocks": 65536, 00:34:35.827 "uuid": "81b74ff7-084b-4f2e-a93e-ad33dadaee20", 00:34:35.827 "assigned_rate_limits": { 00:34:35.827 "rw_ios_per_sec": 0, 00:34:35.827 "rw_mbytes_per_sec": 0, 00:34:35.827 "r_mbytes_per_sec": 0, 00:34:35.827 "w_mbytes_per_sec": 0 00:34:35.827 }, 00:34:35.827 "claimed": true, 00:34:35.827 "claim_type": "exclusive_write", 00:34:35.827 "zoned": false, 00:34:35.827 "supported_io_types": { 00:34:35.827 "read": true, 00:34:35.827 "write": true, 00:34:35.827 "unmap": true, 00:34:35.827 "write_zeroes": true, 00:34:35.827 "flush": true, 00:34:35.827 "reset": true, 00:34:35.827 "compare": false, 00:34:35.827 "compare_and_write": false, 00:34:35.827 "abort": true, 00:34:35.827 "nvme_admin": false, 00:34:35.827 "nvme_io": false 00:34:35.827 }, 00:34:35.827 "memory_domains": [ 00:34:35.827 { 00:34:35.827 "dma_device_id": "system", 00:34:35.827 "dma_device_type": 1 00:34:35.827 }, 00:34:35.827 { 00:34:35.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:35.827 "dma_device_type": 2 00:34:35.827 } 00:34:35.827 ], 00:34:35.827 "driver_specific": {} 00:34:35.827 }' 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:35.827 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:36.086 11:45:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:36.086 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:36.086 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:36.086 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:36.086 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:36.086 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:34:36.086 11:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:36.086 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:36.086 "name": "BaseBdev4", 00:34:36.086 "aliases": [ 00:34:36.086 "60c4b965-e6a9-40f7-9887-f559ed72500e" 00:34:36.086 ], 00:34:36.086 "product_name": "Malloc disk", 00:34:36.086 "block_size": 512, 00:34:36.086 "num_blocks": 65536, 00:34:36.086 "uuid": "60c4b965-e6a9-40f7-9887-f559ed72500e", 00:34:36.086 "assigned_rate_limits": { 00:34:36.086 "rw_ios_per_sec": 0, 00:34:36.086 "rw_mbytes_per_sec": 0, 00:34:36.086 "r_mbytes_per_sec": 0, 00:34:36.086 "w_mbytes_per_sec": 0 00:34:36.086 }, 00:34:36.086 "claimed": true, 00:34:36.086 "claim_type": "exclusive_write", 00:34:36.086 "zoned": false, 00:34:36.086 "supported_io_types": { 00:34:36.086 "read": true, 00:34:36.086 "write": true, 00:34:36.086 "unmap": true, 00:34:36.086 "write_zeroes": true, 00:34:36.086 "flush": true, 00:34:36.086 "reset": true, 00:34:36.086 "compare": false, 00:34:36.086 "compare_and_write": false, 00:34:36.086 "abort": true, 00:34:36.086 "nvme_admin": false, 00:34:36.086 "nvme_io": false 00:34:36.086 }, 00:34:36.086 "memory_domains": [ 00:34:36.086 { 00:34:36.086 "dma_device_id": "system", 
00:34:36.086 "dma_device_type": 1 00:34:36.086 }, 00:34:36.086 { 00:34:36.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:36.086 "dma_device_type": 2 00:34:36.086 } 00:34:36.086 ], 00:34:36.086 "driver_specific": {} 00:34:36.086 }' 00:34:36.086 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:36.344 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:34:36.601 [2024-06-10 11:45:20.492543] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:34:36.601 [2024-06-10 11:45:20.492565] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:36.601 [2024-06-10 11:45:20.492605] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:36.601 [2024-06-10 11:45:20.492646] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:36.601 [2024-06-10 11:45:20.492654] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2664050 name Existed_Raid, state offline 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 191452 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 191452 ']' 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 191452 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:36.601 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 191452 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 191452' 00:34:36.859 killing process with pid 191452 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 191452 00:34:36.859 [2024-06-10 11:45:20.561292] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 191452 00:34:36.859 [2024-06-10 11:45:20.598022] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:34:36.859 00:34:36.859 
real 0m24.562s 00:34:36.859 user 0m44.788s 00:34:36.859 sys 0m4.749s 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:34:36.859 11:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:34:36.859 ************************************ 00:34:36.859 END TEST raid_state_function_test 00:34:36.859 ************************************ 00:34:37.175 11:45:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:34:37.175 11:45:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:34:37.175 11:45:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:34:37.175 11:45:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:34:37.175 ************************************ 00:34:37.175 START TEST raid_state_function_test_sb 00:34:37.175 ************************************ 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 true 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:37.175 11:45:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:34:37.175 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # 
strip_size=64 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=195822 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 195822' 00:34:37.176 Process raid pid: 195822 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 195822 /var/tmp/spdk-raid.sock 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 195822 ']' 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:34:37.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:34:37.176 11:45:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:37.176 [2024-06-10 11:45:20.947307] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:34:37.176 [2024-06-10 11:45:20.947358] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:37.176 [2024-06-10 11:45:21.033573] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:37.442 [2024-06-10 11:45:21.115987] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:34:37.442 [2024-06-10 11:45:21.171550] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:37.442 [2024-06-10 11:45:21.171580] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:38.006 [2024-06-10 11:45:21.917525] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:34:38.006 [2024-06-10 11:45:21.917562] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:34:38.006 [2024-06-10 11:45:21.917569] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:34:38.006 [2024-06-10 11:45:21.917577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:34:38.006 [2024-06-10 11:45:21.917582] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:34:38.006 [2024-06-10 11:45:21.917589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:34:38.006 
[2024-06-10 11:45:21.917598] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:34:38.006 [2024-06-10 11:45:21.917605] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:38.006 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:38.007 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:38.007 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:38.007 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:38.007 11:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:38.264 11:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:38.264 "name": "Existed_Raid", 00:34:38.264 "uuid": "de38f1f2-075f-4514-9b4a-bf57d8eab6b8", 00:34:38.264 
"strip_size_kb": 64, 00:34:38.264 "state": "configuring", 00:34:38.264 "raid_level": "concat", 00:34:38.264 "superblock": true, 00:34:38.264 "num_base_bdevs": 4, 00:34:38.264 "num_base_bdevs_discovered": 0, 00:34:38.264 "num_base_bdevs_operational": 4, 00:34:38.264 "base_bdevs_list": [ 00:34:38.264 { 00:34:38.264 "name": "BaseBdev1", 00:34:38.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:38.264 "is_configured": false, 00:34:38.264 "data_offset": 0, 00:34:38.264 "data_size": 0 00:34:38.264 }, 00:34:38.264 { 00:34:38.264 "name": "BaseBdev2", 00:34:38.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:38.264 "is_configured": false, 00:34:38.264 "data_offset": 0, 00:34:38.264 "data_size": 0 00:34:38.264 }, 00:34:38.264 { 00:34:38.264 "name": "BaseBdev3", 00:34:38.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:38.264 "is_configured": false, 00:34:38.264 "data_offset": 0, 00:34:38.264 "data_size": 0 00:34:38.264 }, 00:34:38.264 { 00:34:38.264 "name": "BaseBdev4", 00:34:38.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:38.264 "is_configured": false, 00:34:38.264 "data_offset": 0, 00:34:38.264 "data_size": 0 00:34:38.264 } 00:34:38.264 ] 00:34:38.264 }' 00:34:38.264 11:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:38.264 11:45:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:38.829 11:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:34:38.829 [2024-06-10 11:45:22.771634] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:34:38.829 [2024-06-10 11:45:22.771657] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a8550 name Existed_Raid, state configuring 00:34:39.087 11:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:39.087 [2024-06-10 11:45:22.940099] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:34:39.087 [2024-06-10 11:45:22.940125] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:34:39.087 [2024-06-10 11:45:22.940131] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:34:39.087 [2024-06-10 11:45:22.940138] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:34:39.087 [2024-06-10 11:45:22.940147] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:34:39.087 [2024-06-10 11:45:22.940155] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:34:39.087 [2024-06-10 11:45:22.940160] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:34:39.087 [2024-06-10 11:45:22.940167] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:34:39.087 11:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:34:39.346 [2024-06-10 11:45:23.133196] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:39.346 BaseBdev1 00:34:39.346 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:34:39.346 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:34:39.346 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:39.346 11:45:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:39.346 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:39.346 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:39.346 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:34:39.605 [ 00:34:39.605 { 00:34:39.605 "name": "BaseBdev1", 00:34:39.605 "aliases": [ 00:34:39.605 "2ec9ed64-8518-4e36-8404-1eb6bd9289cc" 00:34:39.605 ], 00:34:39.605 "product_name": "Malloc disk", 00:34:39.605 "block_size": 512, 00:34:39.605 "num_blocks": 65536, 00:34:39.605 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:39.605 "assigned_rate_limits": { 00:34:39.605 "rw_ios_per_sec": 0, 00:34:39.605 "rw_mbytes_per_sec": 0, 00:34:39.605 "r_mbytes_per_sec": 0, 00:34:39.605 "w_mbytes_per_sec": 0 00:34:39.605 }, 00:34:39.605 "claimed": true, 00:34:39.605 "claim_type": "exclusive_write", 00:34:39.605 "zoned": false, 00:34:39.605 "supported_io_types": { 00:34:39.605 "read": true, 00:34:39.605 "write": true, 00:34:39.605 "unmap": true, 00:34:39.605 "write_zeroes": true, 00:34:39.605 "flush": true, 00:34:39.605 "reset": true, 00:34:39.605 "compare": false, 00:34:39.605 "compare_and_write": false, 00:34:39.605 "abort": true, 00:34:39.605 "nvme_admin": false, 00:34:39.605 "nvme_io": false 00:34:39.605 }, 00:34:39.605 "memory_domains": [ 00:34:39.605 { 00:34:39.605 "dma_device_id": "system", 00:34:39.605 "dma_device_type": 1 00:34:39.605 }, 00:34:39.605 { 00:34:39.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:39.605 
"dma_device_type": 2 00:34:39.605 } 00:34:39.605 ], 00:34:39.605 "driver_specific": {} 00:34:39.605 } 00:34:39.605 ] 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:39.605 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:39.863 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:39.863 "name": "Existed_Raid", 00:34:39.863 "uuid": "edc08fbf-50ff-4974-b710-ff47c3928fa1", 00:34:39.863 "strip_size_kb": 64, 
00:34:39.863 "state": "configuring", 00:34:39.863 "raid_level": "concat", 00:34:39.863 "superblock": true, 00:34:39.863 "num_base_bdevs": 4, 00:34:39.863 "num_base_bdevs_discovered": 1, 00:34:39.863 "num_base_bdevs_operational": 4, 00:34:39.863 "base_bdevs_list": [ 00:34:39.863 { 00:34:39.863 "name": "BaseBdev1", 00:34:39.863 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:39.863 "is_configured": true, 00:34:39.863 "data_offset": 2048, 00:34:39.863 "data_size": 63488 00:34:39.863 }, 00:34:39.863 { 00:34:39.863 "name": "BaseBdev2", 00:34:39.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:39.863 "is_configured": false, 00:34:39.863 "data_offset": 0, 00:34:39.863 "data_size": 0 00:34:39.863 }, 00:34:39.863 { 00:34:39.863 "name": "BaseBdev3", 00:34:39.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:39.863 "is_configured": false, 00:34:39.863 "data_offset": 0, 00:34:39.863 "data_size": 0 00:34:39.863 }, 00:34:39.863 { 00:34:39.863 "name": "BaseBdev4", 00:34:39.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:39.863 "is_configured": false, 00:34:39.863 "data_offset": 0, 00:34:39.863 "data_size": 0 00:34:39.863 } 00:34:39.863 ] 00:34:39.863 }' 00:34:39.863 11:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:39.863 11:45:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:40.431 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:34:40.431 [2024-06-10 11:45:24.272162] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:34:40.431 [2024-06-10 11:45:24.272196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a7dc0 name Existed_Raid, state configuring 00:34:40.431 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:40.689 [2024-06-10 11:45:24.460688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:40.689 [2024-06-10 11:45:24.461729] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:34:40.689 [2024-06-10 11:45:24.461756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:34:40.689 [2024-06-10 11:45:24.461762] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:34:40.689 [2024-06-10 11:45:24.461770] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:34:40.689 [2024-06-10 11:45:24.461776] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:34:40.689 [2024-06-10 11:45:24.461782] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:40.689 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:40.948 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:40.948 "name": "Existed_Raid", 00:34:40.948 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:40.948 "strip_size_kb": 64, 00:34:40.948 "state": "configuring", 00:34:40.948 "raid_level": "concat", 00:34:40.948 "superblock": true, 00:34:40.948 "num_base_bdevs": 4, 00:34:40.948 "num_base_bdevs_discovered": 1, 00:34:40.948 "num_base_bdevs_operational": 4, 00:34:40.948 "base_bdevs_list": [ 00:34:40.948 { 00:34:40.948 "name": "BaseBdev1", 00:34:40.948 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:40.948 "is_configured": true, 00:34:40.948 "data_offset": 2048, 00:34:40.948 "data_size": 63488 00:34:40.948 }, 00:34:40.948 { 00:34:40.948 "name": "BaseBdev2", 00:34:40.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:40.948 "is_configured": false, 00:34:40.948 "data_offset": 0, 00:34:40.948 "data_size": 0 00:34:40.948 }, 00:34:40.948 { 00:34:40.948 "name": "BaseBdev3", 00:34:40.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:40.948 "is_configured": false, 00:34:40.948 "data_offset": 0, 00:34:40.948 
"data_size": 0 00:34:40.948 }, 00:34:40.948 { 00:34:40.948 "name": "BaseBdev4", 00:34:40.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:40.948 "is_configured": false, 00:34:40.948 "data_offset": 0, 00:34:40.948 "data_size": 0 00:34:40.948 } 00:34:40.948 ] 00:34:40.948 }' 00:34:40.948 11:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:40.948 11:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:34:41.516 [2024-06-10 11:45:25.333860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:41.516 BaseBdev2 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:41.516 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:34:41.774 [ 
00:34:41.774 { 00:34:41.774 "name": "BaseBdev2", 00:34:41.774 "aliases": [ 00:34:41.774 "2d7f1249-2598-4aea-8ce2-b662dd8aaa32" 00:34:41.774 ], 00:34:41.774 "product_name": "Malloc disk", 00:34:41.774 "block_size": 512, 00:34:41.774 "num_blocks": 65536, 00:34:41.774 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:41.774 "assigned_rate_limits": { 00:34:41.774 "rw_ios_per_sec": 0, 00:34:41.774 "rw_mbytes_per_sec": 0, 00:34:41.774 "r_mbytes_per_sec": 0, 00:34:41.774 "w_mbytes_per_sec": 0 00:34:41.774 }, 00:34:41.774 "claimed": true, 00:34:41.774 "claim_type": "exclusive_write", 00:34:41.774 "zoned": false, 00:34:41.774 "supported_io_types": { 00:34:41.774 "read": true, 00:34:41.774 "write": true, 00:34:41.774 "unmap": true, 00:34:41.774 "write_zeroes": true, 00:34:41.774 "flush": true, 00:34:41.774 "reset": true, 00:34:41.774 "compare": false, 00:34:41.774 "compare_and_write": false, 00:34:41.774 "abort": true, 00:34:41.774 "nvme_admin": false, 00:34:41.774 "nvme_io": false 00:34:41.774 }, 00:34:41.774 "memory_domains": [ 00:34:41.774 { 00:34:41.774 "dma_device_id": "system", 00:34:41.774 "dma_device_type": 1 00:34:41.774 }, 00:34:41.774 { 00:34:41.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:41.774 "dma_device_type": 2 00:34:41.774 } 00:34:41.774 ], 00:34:41.774 "driver_specific": {} 00:34:41.774 } 00:34:41.774 ] 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:41.774 11:45:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:41.774 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:42.033 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:42.033 "name": "Existed_Raid", 00:34:42.033 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:42.033 "strip_size_kb": 64, 00:34:42.033 "state": "configuring", 00:34:42.033 "raid_level": "concat", 00:34:42.033 "superblock": true, 00:34:42.033 "num_base_bdevs": 4, 00:34:42.033 "num_base_bdevs_discovered": 2, 00:34:42.033 "num_base_bdevs_operational": 4, 00:34:42.033 "base_bdevs_list": [ 00:34:42.033 { 00:34:42.033 "name": "BaseBdev1", 00:34:42.033 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:42.033 "is_configured": true, 00:34:42.033 "data_offset": 2048, 00:34:42.033 "data_size": 63488 00:34:42.033 }, 00:34:42.033 { 00:34:42.033 
"name": "BaseBdev2", 00:34:42.033 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:42.033 "is_configured": true, 00:34:42.033 "data_offset": 2048, 00:34:42.033 "data_size": 63488 00:34:42.033 }, 00:34:42.033 { 00:34:42.033 "name": "BaseBdev3", 00:34:42.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:42.033 "is_configured": false, 00:34:42.033 "data_offset": 0, 00:34:42.033 "data_size": 0 00:34:42.033 }, 00:34:42.033 { 00:34:42.033 "name": "BaseBdev4", 00:34:42.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:42.033 "is_configured": false, 00:34:42.033 "data_offset": 0, 00:34:42.033 "data_size": 0 00:34:42.033 } 00:34:42.033 ] 00:34:42.033 }' 00:34:42.033 11:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:42.033 11:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:42.599 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:34:42.599 [2024-06-10 11:45:26.544039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:42.599 BaseBdev3 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:42.858 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:34:43.116 [ 00:34:43.116 { 00:34:43.116 "name": "BaseBdev3", 00:34:43.116 "aliases": [ 00:34:43.116 "f15310a2-42ff-465e-b216-d7a3524a9c14" 00:34:43.116 ], 00:34:43.116 "product_name": "Malloc disk", 00:34:43.116 "block_size": 512, 00:34:43.116 "num_blocks": 65536, 00:34:43.116 "uuid": "f15310a2-42ff-465e-b216-d7a3524a9c14", 00:34:43.116 "assigned_rate_limits": { 00:34:43.116 "rw_ios_per_sec": 0, 00:34:43.116 "rw_mbytes_per_sec": 0, 00:34:43.116 "r_mbytes_per_sec": 0, 00:34:43.116 "w_mbytes_per_sec": 0 00:34:43.116 }, 00:34:43.116 "claimed": true, 00:34:43.116 "claim_type": "exclusive_write", 00:34:43.116 "zoned": false, 00:34:43.116 "supported_io_types": { 00:34:43.116 "read": true, 00:34:43.116 "write": true, 00:34:43.116 "unmap": true, 00:34:43.116 "write_zeroes": true, 00:34:43.116 "flush": true, 00:34:43.116 "reset": true, 00:34:43.116 "compare": false, 00:34:43.116 "compare_and_write": false, 00:34:43.116 "abort": true, 00:34:43.116 "nvme_admin": false, 00:34:43.116 "nvme_io": false 00:34:43.116 }, 00:34:43.116 "memory_domains": [ 00:34:43.116 { 00:34:43.116 "dma_device_id": "system", 00:34:43.116 "dma_device_type": 1 00:34:43.116 }, 00:34:43.116 { 00:34:43.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:43.116 "dma_device_type": 2 00:34:43.116 } 00:34:43.116 ], 00:34:43.116 "driver_specific": {} 00:34:43.116 } 00:34:43.116 ] 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:43.116 11:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:43.375 11:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:43.375 "name": "Existed_Raid", 00:34:43.375 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:43.375 "strip_size_kb": 64, 00:34:43.375 "state": "configuring", 00:34:43.375 "raid_level": "concat", 00:34:43.375 "superblock": true, 00:34:43.375 "num_base_bdevs": 4, 00:34:43.375 
"num_base_bdevs_discovered": 3, 00:34:43.375 "num_base_bdevs_operational": 4, 00:34:43.375 "base_bdevs_list": [ 00:34:43.375 { 00:34:43.375 "name": "BaseBdev1", 00:34:43.375 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:43.375 "is_configured": true, 00:34:43.375 "data_offset": 2048, 00:34:43.375 "data_size": 63488 00:34:43.375 }, 00:34:43.375 { 00:34:43.375 "name": "BaseBdev2", 00:34:43.375 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:43.375 "is_configured": true, 00:34:43.375 "data_offset": 2048, 00:34:43.375 "data_size": 63488 00:34:43.375 }, 00:34:43.375 { 00:34:43.375 "name": "BaseBdev3", 00:34:43.375 "uuid": "f15310a2-42ff-465e-b216-d7a3524a9c14", 00:34:43.375 "is_configured": true, 00:34:43.375 "data_offset": 2048, 00:34:43.375 "data_size": 63488 00:34:43.375 }, 00:34:43.375 { 00:34:43.375 "name": "BaseBdev4", 00:34:43.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:43.375 "is_configured": false, 00:34:43.375 "data_offset": 0, 00:34:43.375 "data_size": 0 00:34:43.375 } 00:34:43.375 ] 00:34:43.375 }' 00:34:43.375 11:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:43.375 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:34:43.941 [2024-06-10 11:45:27.753965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:34:43.941 [2024-06-10 11:45:27.754094] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a8e20 00:34:43.941 [2024-06-10 11:45:27.754104] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:34:43.941 [2024-06-10 11:45:27.754224] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a9a70 00:34:43.941 [2024-06-10 
11:45:27.754309] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a8e20 00:34:43.941 [2024-06-10 11:45:27.754315] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18a8e20 00:34:43.941 [2024-06-10 11:45:27.754379] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:43.941 BaseBdev4 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:43.941 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:44.200 11:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:34:44.200 [ 00:34:44.200 { 00:34:44.200 "name": "BaseBdev4", 00:34:44.200 "aliases": [ 00:34:44.200 "d633f4a9-8ad1-4990-82f9-3da50ee497c3" 00:34:44.200 ], 00:34:44.200 "product_name": "Malloc disk", 00:34:44.200 "block_size": 512, 00:34:44.200 "num_blocks": 65536, 00:34:44.200 "uuid": "d633f4a9-8ad1-4990-82f9-3da50ee497c3", 00:34:44.200 "assigned_rate_limits": { 00:34:44.200 "rw_ios_per_sec": 0, 00:34:44.200 "rw_mbytes_per_sec": 0, 00:34:44.200 "r_mbytes_per_sec": 0, 00:34:44.200 
"w_mbytes_per_sec": 0 00:34:44.200 }, 00:34:44.200 "claimed": true, 00:34:44.200 "claim_type": "exclusive_write", 00:34:44.200 "zoned": false, 00:34:44.200 "supported_io_types": { 00:34:44.200 "read": true, 00:34:44.200 "write": true, 00:34:44.200 "unmap": true, 00:34:44.200 "write_zeroes": true, 00:34:44.200 "flush": true, 00:34:44.200 "reset": true, 00:34:44.200 "compare": false, 00:34:44.200 "compare_and_write": false, 00:34:44.200 "abort": true, 00:34:44.200 "nvme_admin": false, 00:34:44.200 "nvme_io": false 00:34:44.200 }, 00:34:44.200 "memory_domains": [ 00:34:44.200 { 00:34:44.200 "dma_device_id": "system", 00:34:44.200 "dma_device_type": 1 00:34:44.200 }, 00:34:44.200 { 00:34:44.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:44.200 "dma_device_type": 2 00:34:44.200 } 00:34:44.200 ], 00:34:44.200 "driver_specific": {} 00:34:44.200 } 00:34:44.200 ] 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:44.200 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:44.459 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:44.459 "name": "Existed_Raid", 00:34:44.459 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:44.459 "strip_size_kb": 64, 00:34:44.459 "state": "online", 00:34:44.459 "raid_level": "concat", 00:34:44.459 "superblock": true, 00:34:44.459 "num_base_bdevs": 4, 00:34:44.459 "num_base_bdevs_discovered": 4, 00:34:44.459 "num_base_bdevs_operational": 4, 00:34:44.459 "base_bdevs_list": [ 00:34:44.459 { 00:34:44.459 "name": "BaseBdev1", 00:34:44.459 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:44.459 "is_configured": true, 00:34:44.459 "data_offset": 2048, 00:34:44.459 "data_size": 63488 00:34:44.459 }, 00:34:44.459 { 00:34:44.459 "name": "BaseBdev2", 00:34:44.459 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:44.459 "is_configured": true, 00:34:44.459 "data_offset": 2048, 00:34:44.459 "data_size": 63488 00:34:44.459 }, 00:34:44.459 { 00:34:44.459 "name": "BaseBdev3", 00:34:44.459 "uuid": "f15310a2-42ff-465e-b216-d7a3524a9c14", 00:34:44.459 "is_configured": true, 00:34:44.459 "data_offset": 2048, 00:34:44.459 "data_size": 63488 00:34:44.459 }, 00:34:44.459 { 00:34:44.459 "name": "BaseBdev4", 00:34:44.459 "uuid": 
"d633f4a9-8ad1-4990-82f9-3da50ee497c3", 00:34:44.459 "is_configured": true, 00:34:44.459 "data_offset": 2048, 00:34:44.459 "data_size": 63488 00:34:44.459 } 00:34:44.459 ] 00:34:44.459 }' 00:34:44.459 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:44.459 11:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:34:45.026 [2024-06-10 11:45:28.949249] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:45.026 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:34:45.026 "name": "Existed_Raid", 00:34:45.026 "aliases": [ 00:34:45.026 "ba2c7a37-c71b-43e3-855d-93b12728ab33" 00:34:45.026 ], 00:34:45.026 "product_name": "Raid Volume", 00:34:45.026 "block_size": 512, 00:34:45.026 "num_blocks": 253952, 00:34:45.026 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:45.026 "assigned_rate_limits": { 00:34:45.026 "rw_ios_per_sec": 
0, 00:34:45.026 "rw_mbytes_per_sec": 0, 00:34:45.026 "r_mbytes_per_sec": 0, 00:34:45.026 "w_mbytes_per_sec": 0 00:34:45.026 }, 00:34:45.026 "claimed": false, 00:34:45.026 "zoned": false, 00:34:45.026 "supported_io_types": { 00:34:45.026 "read": true, 00:34:45.026 "write": true, 00:34:45.026 "unmap": true, 00:34:45.026 "write_zeroes": true, 00:34:45.026 "flush": true, 00:34:45.026 "reset": true, 00:34:45.026 "compare": false, 00:34:45.026 "compare_and_write": false, 00:34:45.026 "abort": false, 00:34:45.026 "nvme_admin": false, 00:34:45.026 "nvme_io": false 00:34:45.026 }, 00:34:45.026 "memory_domains": [ 00:34:45.026 { 00:34:45.026 "dma_device_id": "system", 00:34:45.026 "dma_device_type": 1 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:45.026 "dma_device_type": 2 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "system", 00:34:45.026 "dma_device_type": 1 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:45.026 "dma_device_type": 2 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "system", 00:34:45.026 "dma_device_type": 1 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:45.026 "dma_device_type": 2 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "system", 00:34:45.026 "dma_device_type": 1 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:45.026 "dma_device_type": 2 00:34:45.026 } 00:34:45.026 ], 00:34:45.026 "driver_specific": { 00:34:45.026 "raid": { 00:34:45.026 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:45.026 "strip_size_kb": 64, 00:34:45.026 "state": "online", 00:34:45.026 "raid_level": "concat", 00:34:45.026 "superblock": true, 00:34:45.026 "num_base_bdevs": 4, 00:34:45.026 "num_base_bdevs_discovered": 4, 00:34:45.026 "num_base_bdevs_operational": 4, 00:34:45.026 "base_bdevs_list": [ 00:34:45.026 { 00:34:45.026 "name": "BaseBdev1", 
00:34:45.026 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:45.026 "is_configured": true, 00:34:45.026 "data_offset": 2048, 00:34:45.026 "data_size": 63488 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "name": "BaseBdev2", 00:34:45.026 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:45.026 "is_configured": true, 00:34:45.026 "data_offset": 2048, 00:34:45.026 "data_size": 63488 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "name": "BaseBdev3", 00:34:45.026 "uuid": "f15310a2-42ff-465e-b216-d7a3524a9c14", 00:34:45.026 "is_configured": true, 00:34:45.026 "data_offset": 2048, 00:34:45.026 "data_size": 63488 00:34:45.026 }, 00:34:45.026 { 00:34:45.026 "name": "BaseBdev4", 00:34:45.026 "uuid": "d633f4a9-8ad1-4990-82f9-3da50ee497c3", 00:34:45.026 "is_configured": true, 00:34:45.026 "data_offset": 2048, 00:34:45.026 "data_size": 63488 00:34:45.026 } 00:34:45.026 ] 00:34:45.026 } 00:34:45.026 } 00:34:45.026 }' 00:34:45.284 11:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:34:45.284 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:34:45.284 BaseBdev2 00:34:45.284 BaseBdev3 00:34:45.284 BaseBdev4' 00:34:45.284 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:45.284 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:34:45.284 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:45.284 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:45.284 "name": "BaseBdev1", 00:34:45.284 "aliases": [ 00:34:45.284 "2ec9ed64-8518-4e36-8404-1eb6bd9289cc" 00:34:45.284 ], 00:34:45.284 "product_name": "Malloc disk", 
00:34:45.284 "block_size": 512, 00:34:45.284 "num_blocks": 65536, 00:34:45.284 "uuid": "2ec9ed64-8518-4e36-8404-1eb6bd9289cc", 00:34:45.284 "assigned_rate_limits": { 00:34:45.284 "rw_ios_per_sec": 0, 00:34:45.284 "rw_mbytes_per_sec": 0, 00:34:45.284 "r_mbytes_per_sec": 0, 00:34:45.284 "w_mbytes_per_sec": 0 00:34:45.284 }, 00:34:45.284 "claimed": true, 00:34:45.284 "claim_type": "exclusive_write", 00:34:45.284 "zoned": false, 00:34:45.284 "supported_io_types": { 00:34:45.284 "read": true, 00:34:45.284 "write": true, 00:34:45.284 "unmap": true, 00:34:45.284 "write_zeroes": true, 00:34:45.284 "flush": true, 00:34:45.284 "reset": true, 00:34:45.284 "compare": false, 00:34:45.284 "compare_and_write": false, 00:34:45.284 "abort": true, 00:34:45.285 "nvme_admin": false, 00:34:45.285 "nvme_io": false 00:34:45.285 }, 00:34:45.285 "memory_domains": [ 00:34:45.285 { 00:34:45.285 "dma_device_id": "system", 00:34:45.285 "dma_device_type": 1 00:34:45.285 }, 00:34:45.285 { 00:34:45.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:45.285 "dma_device_type": 2 00:34:45.285 } 00:34:45.285 ], 00:34:45.285 "driver_specific": {} 00:34:45.285 }' 00:34:45.285 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:45.543 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:45.801 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:45.801 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:45.801 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:34:45.801 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:45.801 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:45.801 "name": "BaseBdev2", 00:34:45.801 "aliases": [ 00:34:45.801 "2d7f1249-2598-4aea-8ce2-b662dd8aaa32" 00:34:45.801 ], 00:34:45.801 "product_name": "Malloc disk", 00:34:45.801 "block_size": 512, 00:34:45.801 "num_blocks": 65536, 00:34:45.801 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:45.801 "assigned_rate_limits": { 00:34:45.801 "rw_ios_per_sec": 0, 00:34:45.801 "rw_mbytes_per_sec": 0, 00:34:45.801 "r_mbytes_per_sec": 0, 00:34:45.801 "w_mbytes_per_sec": 0 00:34:45.801 }, 00:34:45.801 "claimed": true, 00:34:45.801 "claim_type": "exclusive_write", 00:34:45.801 "zoned": false, 00:34:45.801 "supported_io_types": { 00:34:45.801 "read": true, 00:34:45.801 "write": true, 00:34:45.801 "unmap": true, 00:34:45.801 "write_zeroes": true, 00:34:45.801 "flush": true, 00:34:45.801 "reset": true, 00:34:45.801 "compare": false, 00:34:45.801 "compare_and_write": false, 00:34:45.801 "abort": true, 00:34:45.801 "nvme_admin": false, 00:34:45.801 "nvme_io": false 00:34:45.801 }, 00:34:45.801 "memory_domains": [ 00:34:45.801 { 
00:34:45.801 "dma_device_id": "system", 00:34:45.801 "dma_device_type": 1 00:34:45.801 }, 00:34:45.801 { 00:34:45.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:45.801 "dma_device_type": 2 00:34:45.801 } 00:34:45.801 ], 00:34:45.801 "driver_specific": {} 00:34:45.802 }' 00:34:45.802 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:45.802 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:46.060 11:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:46.330 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:46.331 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:46.331 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:34:46.331 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:46.331 11:45:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:46.331 "name": "BaseBdev3", 00:34:46.331 "aliases": [ 00:34:46.331 "f15310a2-42ff-465e-b216-d7a3524a9c14" 00:34:46.331 ], 00:34:46.331 "product_name": "Malloc disk", 00:34:46.331 "block_size": 512, 00:34:46.331 "num_blocks": 65536, 00:34:46.331 "uuid": "f15310a2-42ff-465e-b216-d7a3524a9c14", 00:34:46.331 "assigned_rate_limits": { 00:34:46.331 "rw_ios_per_sec": 0, 00:34:46.331 "rw_mbytes_per_sec": 0, 00:34:46.331 "r_mbytes_per_sec": 0, 00:34:46.331 "w_mbytes_per_sec": 0 00:34:46.331 }, 00:34:46.331 "claimed": true, 00:34:46.331 "claim_type": "exclusive_write", 00:34:46.331 "zoned": false, 00:34:46.331 "supported_io_types": { 00:34:46.331 "read": true, 00:34:46.331 "write": true, 00:34:46.331 "unmap": true, 00:34:46.331 "write_zeroes": true, 00:34:46.331 "flush": true, 00:34:46.331 "reset": true, 00:34:46.331 "compare": false, 00:34:46.331 "compare_and_write": false, 00:34:46.331 "abort": true, 00:34:46.331 "nvme_admin": false, 00:34:46.331 "nvme_io": false 00:34:46.331 }, 00:34:46.331 "memory_domains": [ 00:34:46.331 { 00:34:46.331 "dma_device_id": "system", 00:34:46.331 "dma_device_type": 1 00:34:46.331 }, 00:34:46.331 { 00:34:46.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:46.331 "dma_device_type": 2 00:34:46.331 } 00:34:46.331 ], 00:34:46.331 "driver_specific": {} 00:34:46.331 }' 00:34:46.331 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:46.331 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:34:46.589 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:46.847 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:46.847 "name": "BaseBdev4", 00:34:46.847 "aliases": [ 00:34:46.847 "d633f4a9-8ad1-4990-82f9-3da50ee497c3" 00:34:46.847 ], 00:34:46.847 "product_name": "Malloc disk", 00:34:46.847 "block_size": 512, 00:34:46.847 "num_blocks": 65536, 00:34:46.847 "uuid": "d633f4a9-8ad1-4990-82f9-3da50ee497c3", 00:34:46.847 "assigned_rate_limits": { 00:34:46.847 "rw_ios_per_sec": 0, 00:34:46.847 "rw_mbytes_per_sec": 0, 00:34:46.847 "r_mbytes_per_sec": 0, 00:34:46.847 "w_mbytes_per_sec": 0 00:34:46.847 }, 00:34:46.847 "claimed": true, 00:34:46.847 "claim_type": "exclusive_write", 00:34:46.847 "zoned": false, 00:34:46.847 "supported_io_types": { 00:34:46.847 "read": true, 00:34:46.847 "write": true, 00:34:46.847 "unmap": true, 00:34:46.847 "write_zeroes": true, 00:34:46.847 "flush": 
true, 00:34:46.847 "reset": true, 00:34:46.847 "compare": false, 00:34:46.847 "compare_and_write": false, 00:34:46.847 "abort": true, 00:34:46.847 "nvme_admin": false, 00:34:46.847 "nvme_io": false 00:34:46.847 }, 00:34:46.847 "memory_domains": [ 00:34:46.847 { 00:34:46.847 "dma_device_id": "system", 00:34:46.847 "dma_device_type": 1 00:34:46.847 }, 00:34:46.847 { 00:34:46.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:46.847 "dma_device_type": 2 00:34:46.847 } 00:34:46.847 ], 00:34:46.847 "driver_specific": {} 00:34:46.847 }' 00:34:46.847 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:46.847 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:46.847 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:46.847 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:47.105 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:47.105 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:47.105 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:47.106 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:47.106 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:47.106 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:47.106 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:47.106 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:34:47.106 11:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:34:47.364 [2024-06-10 11:45:31.134773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:47.364 [2024-06-10 11:45:31.134801] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:47.364 [2024-06-10 11:45:31.134836] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:47.364 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:47.365 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:34:47.365 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:47.365 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:47.365 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:47.622 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:47.622 "name": "Existed_Raid", 00:34:47.622 "uuid": "ba2c7a37-c71b-43e3-855d-93b12728ab33", 00:34:47.622 "strip_size_kb": 64, 00:34:47.622 "state": "offline", 00:34:47.622 "raid_level": "concat", 00:34:47.622 "superblock": true, 00:34:47.622 "num_base_bdevs": 4, 00:34:47.622 "num_base_bdevs_discovered": 3, 00:34:47.622 "num_base_bdevs_operational": 3, 00:34:47.622 "base_bdevs_list": [ 00:34:47.622 { 00:34:47.622 "name": null, 00:34:47.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:47.622 "is_configured": false, 00:34:47.622 "data_offset": 2048, 00:34:47.622 "data_size": 63488 00:34:47.622 }, 00:34:47.622 { 00:34:47.622 "name": "BaseBdev2", 00:34:47.622 "uuid": "2d7f1249-2598-4aea-8ce2-b662dd8aaa32", 00:34:47.622 "is_configured": true, 00:34:47.622 "data_offset": 2048, 00:34:47.622 "data_size": 63488 00:34:47.622 }, 00:34:47.622 { 00:34:47.622 "name": "BaseBdev3", 00:34:47.622 "uuid": "f15310a2-42ff-465e-b216-d7a3524a9c14", 00:34:47.622 "is_configured": true, 00:34:47.622 "data_offset": 2048, 00:34:47.622 "data_size": 63488 00:34:47.622 }, 00:34:47.622 { 00:34:47.622 "name": "BaseBdev4", 00:34:47.622 "uuid": "d633f4a9-8ad1-4990-82f9-3da50ee497c3", 00:34:47.622 "is_configured": true, 00:34:47.622 "data_offset": 2048, 00:34:47.622 "data_size": 63488 00:34:47.622 } 00:34:47.622 ] 00:34:47.622 }' 00:34:47.622 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:34:47.622 11:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:47.879 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:34:47.879 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:48.135 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:48.135 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:34:48.135 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:34:48.135 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:34:48.135 11:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:34:48.391 [2024-06-10 11:45:32.143095] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:34:48.391 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:34:48.391 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:48.391 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:48.391 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:34:48.647 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:34:48.647 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:34:48.647 
11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:34:48.647 [2024-06-10 11:45:32.511092] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:34:48.647 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:34:48.647 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:48.647 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:34:48.647 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:48.904 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:34:48.904 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:34:48.904 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:34:49.162 [2024-06-10 11:45:32.863844] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:34:49.162 [2024-06-10 11:45:32.863884] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a8e20 name Existed_Raid, state offline 00:34:49.162 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:34:49.162 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:34:49.162 11:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:49.162 11:45:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:34:49.162 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:34:49.162 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:34:49.162 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:34:49.162 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:34:49.162 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:34:49.162 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:34:49.420 BaseBdev2 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:49.420 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:49.678 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:34:49.678 [ 00:34:49.678 { 00:34:49.678 "name": "BaseBdev2", 00:34:49.678 "aliases": [ 00:34:49.678 "27681500-b8c2-4431-b16f-0091f93dd823" 00:34:49.678 ], 00:34:49.678 "product_name": "Malloc disk", 00:34:49.678 "block_size": 512, 00:34:49.678 "num_blocks": 65536, 00:34:49.678 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:49.678 "assigned_rate_limits": { 00:34:49.678 "rw_ios_per_sec": 0, 00:34:49.678 "rw_mbytes_per_sec": 0, 00:34:49.678 "r_mbytes_per_sec": 0, 00:34:49.678 "w_mbytes_per_sec": 0 00:34:49.678 }, 00:34:49.678 "claimed": false, 00:34:49.678 "zoned": false, 00:34:49.678 "supported_io_types": { 00:34:49.678 "read": true, 00:34:49.678 "write": true, 00:34:49.678 "unmap": true, 00:34:49.678 "write_zeroes": true, 00:34:49.678 "flush": true, 00:34:49.678 "reset": true, 00:34:49.678 "compare": false, 00:34:49.678 "compare_and_write": false, 00:34:49.678 "abort": true, 00:34:49.678 "nvme_admin": false, 00:34:49.678 "nvme_io": false 00:34:49.678 }, 00:34:49.678 "memory_domains": [ 00:34:49.678 { 00:34:49.678 "dma_device_id": "system", 00:34:49.678 "dma_device_type": 1 00:34:49.678 }, 00:34:49.678 { 00:34:49.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:49.678 "dma_device_type": 2 00:34:49.678 } 00:34:49.678 ], 00:34:49.678 "driver_specific": {} 00:34:49.678 } 00:34:49.678 ] 00:34:49.678 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:49.678 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:34:49.678 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:34:49.678 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:34:49.936 BaseBdev3 00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:49.936 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:50.194 11:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:34:50.194 [ 00:34:50.194 { 00:34:50.194 "name": "BaseBdev3", 00:34:50.194 "aliases": [ 00:34:50.194 "a91f2a3f-d8da-4bdd-8931-57017b098e46" 00:34:50.194 ], 00:34:50.194 "product_name": "Malloc disk", 00:34:50.194 "block_size": 512, 00:34:50.194 "num_blocks": 65536, 00:34:50.194 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:50.194 "assigned_rate_limits": { 00:34:50.194 "rw_ios_per_sec": 0, 00:34:50.194 "rw_mbytes_per_sec": 0, 00:34:50.194 "r_mbytes_per_sec": 0, 00:34:50.194 "w_mbytes_per_sec": 0 00:34:50.194 }, 00:34:50.194 "claimed": false, 00:34:50.194 "zoned": false, 00:34:50.194 "supported_io_types": { 00:34:50.194 "read": true, 00:34:50.194 "write": true, 00:34:50.195 "unmap": true, 00:34:50.195 "write_zeroes": true, 00:34:50.195 "flush": true, 00:34:50.195 "reset": true, 00:34:50.195 "compare": false, 00:34:50.195 "compare_and_write": false, 00:34:50.195 "abort": true, 00:34:50.195 "nvme_admin": false, 00:34:50.195 "nvme_io": false 00:34:50.195 }, 00:34:50.195 
"memory_domains": [ 00:34:50.195 { 00:34:50.195 "dma_device_id": "system", 00:34:50.195 "dma_device_type": 1 00:34:50.195 }, 00:34:50.195 { 00:34:50.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:50.195 "dma_device_type": 2 00:34:50.195 } 00:34:50.195 ], 00:34:50.195 "driver_specific": {} 00:34:50.195 } 00:34:50.195 ] 00:34:50.195 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:50.195 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:34:50.195 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:34:50.195 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:34:50.453 BaseBdev4 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:50.453 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:50.711 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:34:50.711 [ 00:34:50.711 { 00:34:50.711 "name": "BaseBdev4", 00:34:50.711 "aliases": [ 00:34:50.711 "0b111ff7-e42d-4340-9688-4ca323d52390" 00:34:50.711 ], 00:34:50.711 "product_name": "Malloc disk", 00:34:50.711 "block_size": 512, 00:34:50.711 "num_blocks": 65536, 00:34:50.711 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:50.711 "assigned_rate_limits": { 00:34:50.711 "rw_ios_per_sec": 0, 00:34:50.711 "rw_mbytes_per_sec": 0, 00:34:50.711 "r_mbytes_per_sec": 0, 00:34:50.711 "w_mbytes_per_sec": 0 00:34:50.711 }, 00:34:50.711 "claimed": false, 00:34:50.711 "zoned": false, 00:34:50.711 "supported_io_types": { 00:34:50.711 "read": true, 00:34:50.711 "write": true, 00:34:50.711 "unmap": true, 00:34:50.711 "write_zeroes": true, 00:34:50.711 "flush": true, 00:34:50.711 "reset": true, 00:34:50.711 "compare": false, 00:34:50.711 "compare_and_write": false, 00:34:50.711 "abort": true, 00:34:50.711 "nvme_admin": false, 00:34:50.711 "nvme_io": false 00:34:50.711 }, 00:34:50.711 "memory_domains": [ 00:34:50.711 { 00:34:50.711 "dma_device_id": "system", 00:34:50.711 "dma_device_type": 1 00:34:50.711 }, 00:34:50.711 { 00:34:50.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:50.711 "dma_device_type": 2 00:34:50.711 } 00:34:50.711 ], 00:34:50.711 "driver_specific": {} 00:34:50.711 } 00:34:50.711 ] 00:34:50.711 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:50.711 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:34:50.711 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:34:50.711 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:34:50.978 [2024-06-10 11:45:34.726233] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:34:50.978 [2024-06-10 11:45:34.726269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:34:50.978 [2024-06-10 11:45:34.726281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:50.978 [2024-06-10 11:45:34.727269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:50.978 [2024-06-10 11:45:34.727303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:50.978 "name": "Existed_Raid", 00:34:50.978 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:50.978 "strip_size_kb": 64, 00:34:50.978 "state": "configuring", 00:34:50.978 "raid_level": "concat", 00:34:50.978 "superblock": true, 00:34:50.978 "num_base_bdevs": 4, 00:34:50.978 "num_base_bdevs_discovered": 3, 00:34:50.978 "num_base_bdevs_operational": 4, 00:34:50.978 "base_bdevs_list": [ 00:34:50.978 { 00:34:50.978 "name": "BaseBdev1", 00:34:50.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:50.978 "is_configured": false, 00:34:50.978 "data_offset": 0, 00:34:50.978 "data_size": 0 00:34:50.978 }, 00:34:50.978 { 00:34:50.978 "name": "BaseBdev2", 00:34:50.978 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:50.978 "is_configured": true, 00:34:50.978 "data_offset": 2048, 00:34:50.978 "data_size": 63488 00:34:50.978 }, 00:34:50.978 { 00:34:50.978 "name": "BaseBdev3", 00:34:50.978 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:50.978 "is_configured": true, 00:34:50.978 "data_offset": 2048, 00:34:50.978 "data_size": 63488 00:34:50.978 }, 00:34:50.978 { 00:34:50.978 "name": "BaseBdev4", 00:34:50.978 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:50.978 "is_configured": true, 00:34:50.978 "data_offset": 2048, 00:34:50.978 "data_size": 63488 00:34:50.978 } 00:34:50.978 ] 00:34:50.978 }' 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:50.978 11:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:51.545 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev2 00:34:51.803 [2024-06-10 11:45:35.516252] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:51.803 "name": "Existed_Raid", 00:34:51.803 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:51.803 "strip_size_kb": 64, 00:34:51.803 "state": "configuring", 00:34:51.803 "raid_level": "concat", 00:34:51.803 "superblock": true, 
00:34:51.803 "num_base_bdevs": 4, 00:34:51.803 "num_base_bdevs_discovered": 2, 00:34:51.803 "num_base_bdevs_operational": 4, 00:34:51.803 "base_bdevs_list": [ 00:34:51.803 { 00:34:51.803 "name": "BaseBdev1", 00:34:51.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:51.803 "is_configured": false, 00:34:51.803 "data_offset": 0, 00:34:51.803 "data_size": 0 00:34:51.803 }, 00:34:51.803 { 00:34:51.803 "name": null, 00:34:51.803 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:51.803 "is_configured": false, 00:34:51.803 "data_offset": 2048, 00:34:51.803 "data_size": 63488 00:34:51.803 }, 00:34:51.803 { 00:34:51.803 "name": "BaseBdev3", 00:34:51.803 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:51.803 "is_configured": true, 00:34:51.803 "data_offset": 2048, 00:34:51.803 "data_size": 63488 00:34:51.803 }, 00:34:51.803 { 00:34:51.803 "name": "BaseBdev4", 00:34:51.803 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:51.803 "is_configured": true, 00:34:51.803 "data_offset": 2048, 00:34:51.803 "data_size": 63488 00:34:51.803 } 00:34:51.803 ] 00:34:51.803 }' 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:51.803 11:45:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:52.365 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:52.365 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:34:52.621 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:34:52.621 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:34:52.621 [2024-06-10 
11:45:36.533816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:52.621 BaseBdev1 00:34:52.621 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:34:52.622 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:34:52.622 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:52.622 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:52.622 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:52.622 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:52.622 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:52.878 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:34:53.136 [ 00:34:53.136 { 00:34:53.136 "name": "BaseBdev1", 00:34:53.136 "aliases": [ 00:34:53.136 "e63935fb-2530-4342-b909-e2ff3107f2d5" 00:34:53.136 ], 00:34:53.136 "product_name": "Malloc disk", 00:34:53.136 "block_size": 512, 00:34:53.136 "num_blocks": 65536, 00:34:53.136 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:53.136 "assigned_rate_limits": { 00:34:53.136 "rw_ios_per_sec": 0, 00:34:53.136 "rw_mbytes_per_sec": 0, 00:34:53.136 "r_mbytes_per_sec": 0, 00:34:53.136 "w_mbytes_per_sec": 0 00:34:53.136 }, 00:34:53.136 "claimed": true, 00:34:53.136 "claim_type": "exclusive_write", 00:34:53.136 "zoned": false, 00:34:53.136 "supported_io_types": { 00:34:53.136 "read": true, 00:34:53.136 "write": true, 00:34:53.136 "unmap": true, 
00:34:53.136 "write_zeroes": true, 00:34:53.136 "flush": true, 00:34:53.136 "reset": true, 00:34:53.136 "compare": false, 00:34:53.136 "compare_and_write": false, 00:34:53.136 "abort": true, 00:34:53.136 "nvme_admin": false, 00:34:53.136 "nvme_io": false 00:34:53.136 }, 00:34:53.136 "memory_domains": [ 00:34:53.136 { 00:34:53.136 "dma_device_id": "system", 00:34:53.136 "dma_device_type": 1 00:34:53.136 }, 00:34:53.136 { 00:34:53.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:53.136 "dma_device_type": 2 00:34:53.136 } 00:34:53.136 ], 00:34:53.136 "driver_specific": {} 00:34:53.136 } 00:34:53.136 ] 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:53.136 11:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:53.136 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:53.136 "name": "Existed_Raid", 00:34:53.136 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:53.136 "strip_size_kb": 64, 00:34:53.136 "state": "configuring", 00:34:53.136 "raid_level": "concat", 00:34:53.136 "superblock": true, 00:34:53.136 "num_base_bdevs": 4, 00:34:53.136 "num_base_bdevs_discovered": 3, 00:34:53.136 "num_base_bdevs_operational": 4, 00:34:53.136 "base_bdevs_list": [ 00:34:53.136 { 00:34:53.136 "name": "BaseBdev1", 00:34:53.136 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:53.136 "is_configured": true, 00:34:53.136 "data_offset": 2048, 00:34:53.136 "data_size": 63488 00:34:53.136 }, 00:34:53.136 { 00:34:53.136 "name": null, 00:34:53.136 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:53.136 "is_configured": false, 00:34:53.136 "data_offset": 2048, 00:34:53.136 "data_size": 63488 00:34:53.136 }, 00:34:53.136 { 00:34:53.136 "name": "BaseBdev3", 00:34:53.136 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:53.136 "is_configured": true, 00:34:53.136 "data_offset": 2048, 00:34:53.136 "data_size": 63488 00:34:53.136 }, 00:34:53.136 { 00:34:53.136 "name": "BaseBdev4", 00:34:53.136 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:53.136 "is_configured": true, 00:34:53.136 "data_offset": 2048, 00:34:53.136 "data_size": 63488 00:34:53.136 } 00:34:53.136 ] 00:34:53.136 }' 00:34:53.136 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:53.136 11:45:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:53.701 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:34:53.701 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:34:53.959 [2024-06-10 11:45:37.865314] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:53.959 11:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:54.217 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:54.217 "name": "Existed_Raid", 00:34:54.217 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:54.217 "strip_size_kb": 64, 00:34:54.217 "state": "configuring", 00:34:54.217 "raid_level": "concat", 00:34:54.217 "superblock": true, 00:34:54.217 "num_base_bdevs": 4, 00:34:54.217 "num_base_bdevs_discovered": 2, 00:34:54.217 "num_base_bdevs_operational": 4, 00:34:54.217 "base_bdevs_list": [ 00:34:54.217 { 00:34:54.217 "name": "BaseBdev1", 00:34:54.217 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:54.217 "is_configured": true, 00:34:54.217 "data_offset": 2048, 00:34:54.217 "data_size": 63488 00:34:54.217 }, 00:34:54.217 { 00:34:54.217 "name": null, 00:34:54.217 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:54.217 "is_configured": false, 00:34:54.217 "data_offset": 2048, 00:34:54.217 "data_size": 63488 00:34:54.217 }, 00:34:54.217 { 00:34:54.217 "name": null, 00:34:54.217 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:54.217 "is_configured": false, 00:34:54.217 "data_offset": 2048, 00:34:54.217 "data_size": 63488 00:34:54.217 }, 00:34:54.217 { 00:34:54.217 "name": "BaseBdev4", 00:34:54.217 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:54.217 "is_configured": true, 00:34:54.217 "data_offset": 2048, 00:34:54.217 "data_size": 63488 00:34:54.217 } 00:34:54.217 ] 00:34:54.217 }' 00:34:54.217 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:54.217 11:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:54.783 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:54.783 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:34:54.783 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:34:54.783 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:34:55.041 [2024-06-10 11:45:38.871950] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:55.041 11:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:55.299 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:55.299 "name": "Existed_Raid", 00:34:55.299 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:55.299 "strip_size_kb": 64, 00:34:55.299 "state": "configuring", 00:34:55.299 "raid_level": "concat", 00:34:55.299 "superblock": true, 00:34:55.299 "num_base_bdevs": 4, 00:34:55.299 "num_base_bdevs_discovered": 3, 00:34:55.299 "num_base_bdevs_operational": 4, 00:34:55.299 "base_bdevs_list": [ 00:34:55.299 { 00:34:55.299 "name": "BaseBdev1", 00:34:55.299 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:55.299 "is_configured": true, 00:34:55.299 "data_offset": 2048, 00:34:55.299 "data_size": 63488 00:34:55.299 }, 00:34:55.299 { 00:34:55.299 "name": null, 00:34:55.299 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:55.299 "is_configured": false, 00:34:55.299 "data_offset": 2048, 00:34:55.299 "data_size": 63488 00:34:55.299 }, 00:34:55.299 { 00:34:55.299 "name": "BaseBdev3", 00:34:55.299 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:55.299 "is_configured": true, 00:34:55.299 "data_offset": 2048, 00:34:55.299 "data_size": 63488 00:34:55.299 }, 00:34:55.299 { 00:34:55.299 "name": "BaseBdev4", 00:34:55.299 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:55.299 "is_configured": true, 00:34:55.299 "data_offset": 2048, 00:34:55.299 "data_size": 63488 00:34:55.299 } 00:34:55.299 ] 00:34:55.300 }' 00:34:55.300 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:55.300 11:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:55.866 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:55.866 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:34:55.866 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:34:55.866 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:34:56.125 [2024-06-10 11:45:39.922682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:56.125 11:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:56.383 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:56.383 "name": "Existed_Raid", 00:34:56.383 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:56.383 "strip_size_kb": 64, 00:34:56.383 "state": "configuring", 00:34:56.383 "raid_level": "concat", 00:34:56.383 "superblock": true, 00:34:56.383 "num_base_bdevs": 4, 00:34:56.383 "num_base_bdevs_discovered": 2, 00:34:56.383 "num_base_bdevs_operational": 4, 00:34:56.383 "base_bdevs_list": [ 00:34:56.383 { 00:34:56.383 "name": null, 00:34:56.383 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:56.383 "is_configured": false, 00:34:56.383 "data_offset": 2048, 00:34:56.383 "data_size": 63488 00:34:56.383 }, 00:34:56.383 { 00:34:56.383 "name": null, 00:34:56.383 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:56.383 "is_configured": false, 00:34:56.383 "data_offset": 2048, 00:34:56.383 "data_size": 63488 00:34:56.383 }, 00:34:56.383 { 00:34:56.383 "name": "BaseBdev3", 00:34:56.383 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:56.383 "is_configured": true, 00:34:56.383 "data_offset": 2048, 00:34:56.383 "data_size": 63488 00:34:56.383 }, 00:34:56.383 { 00:34:56.383 "name": "BaseBdev4", 00:34:56.383 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:56.383 "is_configured": true, 00:34:56.383 "data_offset": 2048, 00:34:56.383 "data_size": 63488 00:34:56.383 } 00:34:56.383 ] 00:34:56.383 }' 00:34:56.383 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:56.383 11:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:56.949 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:56.949 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:34:56.949 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:34:56.949 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:34:57.207 [2024-06-10 11:45:40.935243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:57.207 11:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:57.207 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:57.207 "name": "Existed_Raid", 00:34:57.207 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:57.207 "strip_size_kb": 64, 00:34:57.207 "state": "configuring", 00:34:57.207 "raid_level": "concat", 00:34:57.207 "superblock": true, 00:34:57.207 "num_base_bdevs": 4, 00:34:57.207 "num_base_bdevs_discovered": 3, 00:34:57.207 "num_base_bdevs_operational": 4, 00:34:57.207 "base_bdevs_list": [ 00:34:57.207 { 00:34:57.207 "name": null, 00:34:57.207 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:57.207 "is_configured": false, 00:34:57.207 "data_offset": 2048, 00:34:57.207 "data_size": 63488 00:34:57.207 }, 00:34:57.207 { 00:34:57.207 "name": "BaseBdev2", 00:34:57.207 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:57.207 "is_configured": true, 00:34:57.207 "data_offset": 2048, 00:34:57.207 "data_size": 63488 00:34:57.207 }, 00:34:57.207 { 00:34:57.207 "name": "BaseBdev3", 00:34:57.207 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:57.207 "is_configured": true, 00:34:57.207 "data_offset": 2048, 00:34:57.207 "data_size": 63488 00:34:57.207 }, 00:34:57.207 { 00:34:57.207 "name": "BaseBdev4", 00:34:57.207 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:57.207 "is_configured": true, 00:34:57.207 "data_offset": 2048, 00:34:57.207 "data_size": 63488 00:34:57.207 } 00:34:57.207 ] 00:34:57.207 }' 00:34:57.207 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:57.207 11:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:57.773 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:57.773 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:34:58.030 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:34:58.030 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:58.030 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:34:58.030 11:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e63935fb-2530-4342-b909-e2ff3107f2d5 00:34:58.288 [2024-06-10 11:45:42.122448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:34:58.288 [2024-06-10 11:45:42.122602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a8b00 00:34:58.288 [2024-06-10 11:45:42.122611] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:34:58.288 [2024-06-10 11:45:42.122735] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18adef0 00:34:58.288 [2024-06-10 11:45:42.122825] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a8b00 00:34:58.288 [2024-06-10 11:45:42.122831] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18a8b00 00:34:58.288 [2024-06-10 11:45:42.122906] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:58.288 NewBaseBdev 00:34:58.288 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:34:58.288 11:45:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:34:58.288 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:34:58.288 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:34:58.288 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:34:58.288 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:34:58.288 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:34:58.545 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:34:58.546 [ 00:34:58.546 { 00:34:58.546 "name": "NewBaseBdev", 00:34:58.546 "aliases": [ 00:34:58.546 "e63935fb-2530-4342-b909-e2ff3107f2d5" 00:34:58.546 ], 00:34:58.546 "product_name": "Malloc disk", 00:34:58.546 "block_size": 512, 00:34:58.546 "num_blocks": 65536, 00:34:58.546 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:58.546 "assigned_rate_limits": { 00:34:58.546 "rw_ios_per_sec": 0, 00:34:58.546 "rw_mbytes_per_sec": 0, 00:34:58.546 "r_mbytes_per_sec": 0, 00:34:58.546 "w_mbytes_per_sec": 0 00:34:58.546 }, 00:34:58.546 "claimed": true, 00:34:58.546 "claim_type": "exclusive_write", 00:34:58.546 "zoned": false, 00:34:58.546 "supported_io_types": { 00:34:58.546 "read": true, 00:34:58.546 "write": true, 00:34:58.546 "unmap": true, 00:34:58.546 "write_zeroes": true, 00:34:58.546 "flush": true, 00:34:58.546 "reset": true, 00:34:58.546 "compare": false, 00:34:58.546 "compare_and_write": false, 00:34:58.546 "abort": true, 00:34:58.546 "nvme_admin": false, 00:34:58.546 "nvme_io": false 
00:34:58.546 }, 00:34:58.546 "memory_domains": [ 00:34:58.546 { 00:34:58.546 "dma_device_id": "system", 00:34:58.546 "dma_device_type": 1 00:34:58.546 }, 00:34:58.546 { 00:34:58.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:58.546 "dma_device_type": 2 00:34:58.546 } 00:34:58.546 ], 00:34:58.546 "driver_specific": {} 00:34:58.546 } 00:34:58.546 ] 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:58.546 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:34:58.803 
11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:58.803 "name": "Existed_Raid", 00:34:58.803 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:58.803 "strip_size_kb": 64, 00:34:58.803 "state": "online", 00:34:58.803 "raid_level": "concat", 00:34:58.803 "superblock": true, 00:34:58.803 "num_base_bdevs": 4, 00:34:58.803 "num_base_bdevs_discovered": 4, 00:34:58.803 "num_base_bdevs_operational": 4, 00:34:58.803 "base_bdevs_list": [ 00:34:58.803 { 00:34:58.803 "name": "NewBaseBdev", 00:34:58.803 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:58.803 "is_configured": true, 00:34:58.803 "data_offset": 2048, 00:34:58.803 "data_size": 63488 00:34:58.803 }, 00:34:58.803 { 00:34:58.803 "name": "BaseBdev2", 00:34:58.803 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:58.803 "is_configured": true, 00:34:58.803 "data_offset": 2048, 00:34:58.803 "data_size": 63488 00:34:58.803 }, 00:34:58.803 { 00:34:58.803 "name": "BaseBdev3", 00:34:58.803 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:58.803 "is_configured": true, 00:34:58.803 "data_offset": 2048, 00:34:58.803 "data_size": 63488 00:34:58.803 }, 00:34:58.803 { 00:34:58.803 "name": "BaseBdev4", 00:34:58.803 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:34:58.803 "is_configured": true, 00:34:58.803 "data_offset": 2048, 00:34:58.803 "data_size": 63488 00:34:58.803 } 00:34:58.803 ] 00:34:58.803 }' 00:34:58.803 11:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:58.803 11:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:34:59.369 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:34:59.369 [2024-06-10 11:45:43.305720] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:34:59.628 "name": "Existed_Raid", 00:34:59.628 "aliases": [ 00:34:59.628 "2d435a1a-9b8a-4f91-b103-e445165ecc70" 00:34:59.628 ], 00:34:59.628 "product_name": "Raid Volume", 00:34:59.628 "block_size": 512, 00:34:59.628 "num_blocks": 253952, 00:34:59.628 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:59.628 "assigned_rate_limits": { 00:34:59.628 "rw_ios_per_sec": 0, 00:34:59.628 "rw_mbytes_per_sec": 0, 00:34:59.628 "r_mbytes_per_sec": 0, 00:34:59.628 "w_mbytes_per_sec": 0 00:34:59.628 }, 00:34:59.628 "claimed": false, 00:34:59.628 "zoned": false, 00:34:59.628 "supported_io_types": { 00:34:59.628 "read": true, 00:34:59.628 "write": true, 00:34:59.628 "unmap": true, 00:34:59.628 "write_zeroes": true, 00:34:59.628 "flush": true, 00:34:59.628 "reset": true, 00:34:59.628 "compare": false, 00:34:59.628 "compare_and_write": false, 00:34:59.628 "abort": false, 00:34:59.628 "nvme_admin": false, 00:34:59.628 "nvme_io": false 00:34:59.628 }, 00:34:59.628 "memory_domains": [ 00:34:59.628 { 00:34:59.628 "dma_device_id": "system", 00:34:59.628 "dma_device_type": 1 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:59.628 "dma_device_type": 2 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "dma_device_id": "system", 00:34:59.628 "dma_device_type": 1 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:59.628 "dma_device_type": 2 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "dma_device_id": "system", 00:34:59.628 "dma_device_type": 1 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:59.628 "dma_device_type": 2 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "dma_device_id": "system", 00:34:59.628 "dma_device_type": 1 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:59.628 "dma_device_type": 2 00:34:59.628 } 00:34:59.628 ], 00:34:59.628 "driver_specific": { 00:34:59.628 "raid": { 00:34:59.628 "uuid": "2d435a1a-9b8a-4f91-b103-e445165ecc70", 00:34:59.628 "strip_size_kb": 64, 00:34:59.628 "state": "online", 00:34:59.628 "raid_level": "concat", 00:34:59.628 "superblock": true, 00:34:59.628 "num_base_bdevs": 4, 00:34:59.628 "num_base_bdevs_discovered": 4, 00:34:59.628 "num_base_bdevs_operational": 4, 00:34:59.628 "base_bdevs_list": [ 00:34:59.628 { 00:34:59.628 "name": "NewBaseBdev", 00:34:59.628 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:59.628 "is_configured": true, 00:34:59.628 "data_offset": 2048, 00:34:59.628 "data_size": 63488 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "name": "BaseBdev2", 00:34:59.628 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:34:59.628 "is_configured": true, 00:34:59.628 "data_offset": 2048, 00:34:59.628 "data_size": 63488 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "name": "BaseBdev3", 00:34:59.628 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:34:59.628 "is_configured": true, 00:34:59.628 "data_offset": 2048, 00:34:59.628 "data_size": 63488 00:34:59.628 }, 00:34:59.628 { 00:34:59.628 "name": "BaseBdev4", 00:34:59.628 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 
00:34:59.628 "is_configured": true, 00:34:59.628 "data_offset": 2048, 00:34:59.628 "data_size": 63488 00:34:59.628 } 00:34:59.628 ] 00:34:59.628 } 00:34:59.628 } 00:34:59.628 }' 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:34:59.628 BaseBdev2 00:34:59.628 BaseBdev3 00:34:59.628 BaseBdev4' 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:34:59.628 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:34:59.628 "name": "NewBaseBdev", 00:34:59.628 "aliases": [ 00:34:59.628 "e63935fb-2530-4342-b909-e2ff3107f2d5" 00:34:59.628 ], 00:34:59.629 "product_name": "Malloc disk", 00:34:59.629 "block_size": 512, 00:34:59.629 "num_blocks": 65536, 00:34:59.629 "uuid": "e63935fb-2530-4342-b909-e2ff3107f2d5", 00:34:59.629 "assigned_rate_limits": { 00:34:59.629 "rw_ios_per_sec": 0, 00:34:59.629 "rw_mbytes_per_sec": 0, 00:34:59.629 "r_mbytes_per_sec": 0, 00:34:59.629 "w_mbytes_per_sec": 0 00:34:59.629 }, 00:34:59.629 "claimed": true, 00:34:59.629 "claim_type": "exclusive_write", 00:34:59.629 "zoned": false, 00:34:59.629 "supported_io_types": { 00:34:59.629 "read": true, 00:34:59.629 "write": true, 00:34:59.629 "unmap": true, 00:34:59.629 "write_zeroes": true, 00:34:59.629 "flush": true, 00:34:59.629 "reset": true, 00:34:59.629 "compare": false, 00:34:59.629 "compare_and_write": false, 00:34:59.629 "abort": true, 
00:34:59.629 "nvme_admin": false, 00:34:59.629 "nvme_io": false 00:34:59.629 }, 00:34:59.629 "memory_domains": [ 00:34:59.629 { 00:34:59.629 "dma_device_id": "system", 00:34:59.629 "dma_device_type": 1 00:34:59.629 }, 00:34:59.629 { 00:34:59.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:59.629 "dma_device_type": 2 00:34:59.629 } 00:34:59.629 ], 00:34:59.629 "driver_specific": {} 00:34:59.629 }' 00:34:59.629 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:59.629 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:34:59.887 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:00.145 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:00.145 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:00.145 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:35:00.145 11:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:00.145 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:00.145 "name": "BaseBdev2", 00:35:00.145 "aliases": [ 00:35:00.145 "27681500-b8c2-4431-b16f-0091f93dd823" 00:35:00.145 ], 00:35:00.145 "product_name": "Malloc disk", 00:35:00.145 "block_size": 512, 00:35:00.145 "num_blocks": 65536, 00:35:00.145 "uuid": "27681500-b8c2-4431-b16f-0091f93dd823", 00:35:00.145 "assigned_rate_limits": { 00:35:00.145 "rw_ios_per_sec": 0, 00:35:00.145 "rw_mbytes_per_sec": 0, 00:35:00.145 "r_mbytes_per_sec": 0, 00:35:00.145 "w_mbytes_per_sec": 0 00:35:00.145 }, 00:35:00.145 "claimed": true, 00:35:00.145 "claim_type": "exclusive_write", 00:35:00.145 "zoned": false, 00:35:00.145 "supported_io_types": { 00:35:00.145 "read": true, 00:35:00.145 "write": true, 00:35:00.145 "unmap": true, 00:35:00.145 "write_zeroes": true, 00:35:00.145 "flush": true, 00:35:00.145 "reset": true, 00:35:00.145 "compare": false, 00:35:00.146 "compare_and_write": false, 00:35:00.146 "abort": true, 00:35:00.146 "nvme_admin": false, 00:35:00.146 "nvme_io": false 00:35:00.146 }, 00:35:00.146 "memory_domains": [ 00:35:00.146 { 00:35:00.146 "dma_device_id": "system", 00:35:00.146 "dma_device_type": 1 00:35:00.146 }, 00:35:00.146 { 00:35:00.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:00.146 "dma_device_type": 2 00:35:00.146 } 00:35:00.146 ], 00:35:00.146 "driver_specific": {} 00:35:00.146 }' 00:35:00.146 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:00.146 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:00.403 11:45:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:00.403 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:00.663 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:00.663 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:00.663 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:00.663 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:35:00.663 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:00.663 "name": "BaseBdev3", 00:35:00.663 "aliases": [ 00:35:00.663 "a91f2a3f-d8da-4bdd-8931-57017b098e46" 00:35:00.663 ], 00:35:00.663 "product_name": "Malloc disk", 00:35:00.663 "block_size": 512, 00:35:00.663 "num_blocks": 65536, 00:35:00.663 "uuid": "a91f2a3f-d8da-4bdd-8931-57017b098e46", 00:35:00.663 "assigned_rate_limits": { 00:35:00.663 "rw_ios_per_sec": 0, 00:35:00.663 "rw_mbytes_per_sec": 0, 00:35:00.663 "r_mbytes_per_sec": 0, 00:35:00.663 "w_mbytes_per_sec": 0 00:35:00.663 }, 00:35:00.663 "claimed": true, 00:35:00.663 "claim_type": "exclusive_write", 00:35:00.663 "zoned": false, 00:35:00.663 "supported_io_types": 
{ 00:35:00.663 "read": true, 00:35:00.663 "write": true, 00:35:00.664 "unmap": true, 00:35:00.664 "write_zeroes": true, 00:35:00.664 "flush": true, 00:35:00.664 "reset": true, 00:35:00.664 "compare": false, 00:35:00.664 "compare_and_write": false, 00:35:00.664 "abort": true, 00:35:00.664 "nvme_admin": false, 00:35:00.664 "nvme_io": false 00:35:00.664 }, 00:35:00.664 "memory_domains": [ 00:35:00.664 { 00:35:00.664 "dma_device_id": "system", 00:35:00.664 "dma_device_type": 1 00:35:00.664 }, 00:35:00.664 { 00:35:00.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:00.664 "dma_device_type": 2 00:35:00.664 } 00:35:00.664 ], 00:35:00.664 "driver_specific": {} 00:35:00.664 }' 00:35:00.664 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:00.664 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:01.007 11:45:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:35:01.007 11:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:01.266 "name": "BaseBdev4", 00:35:01.266 "aliases": [ 00:35:01.266 "0b111ff7-e42d-4340-9688-4ca323d52390" 00:35:01.266 ], 00:35:01.266 "product_name": "Malloc disk", 00:35:01.266 "block_size": 512, 00:35:01.266 "num_blocks": 65536, 00:35:01.266 "uuid": "0b111ff7-e42d-4340-9688-4ca323d52390", 00:35:01.266 "assigned_rate_limits": { 00:35:01.266 "rw_ios_per_sec": 0, 00:35:01.266 "rw_mbytes_per_sec": 0, 00:35:01.266 "r_mbytes_per_sec": 0, 00:35:01.266 "w_mbytes_per_sec": 0 00:35:01.266 }, 00:35:01.266 "claimed": true, 00:35:01.266 "claim_type": "exclusive_write", 00:35:01.266 "zoned": false, 00:35:01.266 "supported_io_types": { 00:35:01.266 "read": true, 00:35:01.266 "write": true, 00:35:01.266 "unmap": true, 00:35:01.266 "write_zeroes": true, 00:35:01.266 "flush": true, 00:35:01.266 "reset": true, 00:35:01.266 "compare": false, 00:35:01.266 "compare_and_write": false, 00:35:01.266 "abort": true, 00:35:01.266 "nvme_admin": false, 00:35:01.266 "nvme_io": false 00:35:01.266 }, 00:35:01.266 "memory_domains": [ 00:35:01.266 { 00:35:01.266 "dma_device_id": "system", 00:35:01.266 "dma_device_type": 1 00:35:01.266 }, 00:35:01.266 { 00:35:01.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:01.266 "dma_device_type": 2 00:35:01.266 } 00:35:01.266 ], 00:35:01.266 "driver_specific": {} 00:35:01.266 }' 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:01.266 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:01.524 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:01.524 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:01.524 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:01.524 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:01.524 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:01.524 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:35:01.783 [2024-06-10 11:45:45.491183] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:35:01.783 [2024-06-10 11:45:45.491204] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:01.783 [2024-06-10 11:45:45.491242] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:01.783 [2024-06-10 11:45:45.491285] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:01.783 [2024-06-10 11:45:45.491298] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a8b00 name Existed_Raid, state offline 00:35:01.783 
11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 195822 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 195822 ']' 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 195822 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 195822 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 195822' 00:35:01.783 killing process with pid 195822 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 195822 00:35:01.783 [2024-06-10 11:45:45.559144] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:01.783 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 195822 00:35:01.783 [2024-06-10 11:45:45.594309] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:02.042 11:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:35:02.042 00:35:02.042 real 0m24.905s 00:35:02.042 user 0m45.457s 00:35:02.042 sys 0m4.820s 00:35:02.042 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:02.042 11:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:02.042 ************************************ 
00:35:02.042 END TEST raid_state_function_test_sb 00:35:02.042 ************************************ 00:35:02.042 11:45:45 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:35:02.042 11:45:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:35:02.042 11:45:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:02.042 11:45:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:02.042 ************************************ 00:35:02.042 START TEST raid_superblock_test 00:35:02.042 ************************************ 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 4 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:35:02.042 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local 
raid_bdev_uuid 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=199690 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 199690 /var/tmp/spdk-raid.sock 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 199690 ']' 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:02.043 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:35:02.043 11:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:35:02.043 [2024-06-10 11:45:45.924679] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:35:02.043 [2024-06-10 11:45:45.924730] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199690 ] 00:35:02.301 [2024-06-10 11:45:46.010997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:02.301 [2024-06-10 11:45:46.094951] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:02.301 [2024-06-10 11:45:46.146817] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:02.301 [2024-06-10 11:45:46.146847] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:02.868 11:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:02.868 11:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:35:02.868 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:35:02.868 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:35:02.869 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:35:03.127 malloc1 00:35:03.127 11:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:35:03.127 [2024-06-10 11:45:47.049189] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:35:03.127 [2024-06-10 11:45:47.049229] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:03.127 [2024-06-10 11:45:47.049244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c95100 00:35:03.127 [2024-06-10 11:45:47.049252] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:03.127 [2024-06-10 11:45:47.050499] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:03.127 [2024-06-10 11:45:47.050523] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:35:03.127 pt1 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:35:03.127 11:45:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:35:03.386 malloc2 00:35:03.386 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:35:03.644 [2024-06-10 11:45:47.391103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:35:03.644 [2024-06-10 11:45:47.391142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:03.644 [2024-06-10 11:45:47.391157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c96500 00:35:03.644 [2024-06-10 11:45:47.391166] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:03.644 [2024-06-10 11:45:47.392340] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:03.644 [2024-06-10 11:45:47.392364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:35:03.644 pt2 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:35:03.644 malloc3 00:35:03.644 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:35:03.903 [2024-06-10 11:45:47.731688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:35:03.903 [2024-06-10 11:45:47.731727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:03.903 [2024-06-10 11:45:47.731740] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e407a0 00:35:03.903 [2024-06-10 11:45:47.731748] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:03.903 [2024-06-10 11:45:47.732852] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:03.903 [2024-06-10 11:45:47.732883] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:35:03.903 pt3 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:35:03.903 11:45:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:35:03.903 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:35:04.161 malloc4 00:35:04.161 11:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:35:04.161 [2024-06-10 11:45:48.084464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:35:04.161 [2024-06-10 11:45:48.084503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:04.161 [2024-06-10 11:45:48.084515] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e42b50 00:35:04.161 [2024-06-10 11:45:48.084523] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:04.161 [2024-06-10 11:45:48.085680] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:04.161 [2024-06-10 11:45:48.085704] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:35:04.161 pt4 00:35:04.161 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:35:04.161 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:35:04.161 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:35:04.420 [2024-06-10 11:45:48.256921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:35:04.420 [2024-06-10 11:45:48.257895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:35:04.420 [2024-06-10 11:45:48.257933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:35:04.420 [2024-06-10 11:45:48.257963] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:35:04.420 [2024-06-10 11:45:48.258080] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e431a0 00:35:04.420 [2024-06-10 11:45:48.258088] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:35:04.420 [2024-06-10 11:45:48.258226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c93f50 00:35:04.420 [2024-06-10 11:45:48.258322] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e431a0 00:35:04.420 [2024-06-10 11:45:48.258328] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e431a0 00:35:04.420 [2024-06-10 11:45:48.258393] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:04.420 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:04.678 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:04.678 "name": "raid_bdev1", 00:35:04.678 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:04.678 "strip_size_kb": 64, 00:35:04.678 "state": "online", 00:35:04.678 "raid_level": "concat", 00:35:04.678 "superblock": true, 00:35:04.678 "num_base_bdevs": 4, 00:35:04.678 "num_base_bdevs_discovered": 4, 00:35:04.678 "num_base_bdevs_operational": 4, 00:35:04.678 "base_bdevs_list": [ 00:35:04.678 { 00:35:04.678 "name": "pt1", 00:35:04.678 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:04.678 "is_configured": true, 00:35:04.678 "data_offset": 2048, 00:35:04.678 "data_size": 63488 00:35:04.678 }, 00:35:04.678 { 00:35:04.678 "name": "pt2", 00:35:04.678 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:04.678 "is_configured": true, 00:35:04.679 "data_offset": 2048, 00:35:04.679 "data_size": 63488 00:35:04.679 }, 00:35:04.679 { 00:35:04.679 "name": "pt3", 00:35:04.679 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:04.679 "is_configured": true, 00:35:04.679 "data_offset": 2048, 00:35:04.679 "data_size": 63488 00:35:04.679 }, 00:35:04.679 { 00:35:04.679 "name": "pt4", 00:35:04.679 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:04.679 "is_configured": true, 00:35:04.679 "data_offset": 2048, 00:35:04.679 "data_size": 63488 00:35:04.679 } 00:35:04.679 ] 00:35:04.679 }' 00:35:04.679 11:45:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:04.679 11:45:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:35:05.245 11:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:35:05.245 [2024-06-10 11:45:49.107373] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:05.245 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:35:05.245 "name": "raid_bdev1", 00:35:05.245 "aliases": [ 00:35:05.245 "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41" 00:35:05.245 ], 00:35:05.245 "product_name": "Raid Volume", 00:35:05.245 "block_size": 512, 00:35:05.245 "num_blocks": 253952, 00:35:05.245 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:05.245 "assigned_rate_limits": { 00:35:05.245 "rw_ios_per_sec": 0, 00:35:05.245 "rw_mbytes_per_sec": 0, 00:35:05.245 "r_mbytes_per_sec": 0, 00:35:05.245 "w_mbytes_per_sec": 0 00:35:05.245 }, 00:35:05.245 "claimed": false, 00:35:05.245 "zoned": false, 00:35:05.245 "supported_io_types": { 00:35:05.245 "read": true, 00:35:05.245 "write": true, 00:35:05.245 
"unmap": true, 00:35:05.245 "write_zeroes": true, 00:35:05.245 "flush": true, 00:35:05.245 "reset": true, 00:35:05.245 "compare": false, 00:35:05.245 "compare_and_write": false, 00:35:05.245 "abort": false, 00:35:05.245 "nvme_admin": false, 00:35:05.245 "nvme_io": false 00:35:05.245 }, 00:35:05.245 "memory_domains": [ 00:35:05.245 { 00:35:05.245 "dma_device_id": "system", 00:35:05.245 "dma_device_type": 1 00:35:05.245 }, 00:35:05.245 { 00:35:05.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:05.245 "dma_device_type": 2 00:35:05.245 }, 00:35:05.245 { 00:35:05.245 "dma_device_id": "system", 00:35:05.245 "dma_device_type": 1 00:35:05.245 }, 00:35:05.245 { 00:35:05.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:05.245 "dma_device_type": 2 00:35:05.245 }, 00:35:05.245 { 00:35:05.245 "dma_device_id": "system", 00:35:05.245 "dma_device_type": 1 00:35:05.245 }, 00:35:05.245 { 00:35:05.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:05.245 "dma_device_type": 2 00:35:05.245 }, 00:35:05.246 { 00:35:05.246 "dma_device_id": "system", 00:35:05.246 "dma_device_type": 1 00:35:05.246 }, 00:35:05.246 { 00:35:05.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:05.246 "dma_device_type": 2 00:35:05.246 } 00:35:05.246 ], 00:35:05.246 "driver_specific": { 00:35:05.246 "raid": { 00:35:05.246 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:05.246 "strip_size_kb": 64, 00:35:05.246 "state": "online", 00:35:05.246 "raid_level": "concat", 00:35:05.246 "superblock": true, 00:35:05.246 "num_base_bdevs": 4, 00:35:05.246 "num_base_bdevs_discovered": 4, 00:35:05.246 "num_base_bdevs_operational": 4, 00:35:05.246 "base_bdevs_list": [ 00:35:05.246 { 00:35:05.246 "name": "pt1", 00:35:05.246 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:05.246 "is_configured": true, 00:35:05.246 "data_offset": 2048, 00:35:05.246 "data_size": 63488 00:35:05.246 }, 00:35:05.246 { 00:35:05.246 "name": "pt2", 00:35:05.246 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:05.246 
"is_configured": true, 00:35:05.246 "data_offset": 2048, 00:35:05.246 "data_size": 63488 00:35:05.246 }, 00:35:05.246 { 00:35:05.246 "name": "pt3", 00:35:05.246 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:05.246 "is_configured": true, 00:35:05.246 "data_offset": 2048, 00:35:05.246 "data_size": 63488 00:35:05.246 }, 00:35:05.246 { 00:35:05.246 "name": "pt4", 00:35:05.246 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:05.246 "is_configured": true, 00:35:05.246 "data_offset": 2048, 00:35:05.246 "data_size": 63488 00:35:05.246 } 00:35:05.246 ] 00:35:05.246 } 00:35:05.246 } 00:35:05.246 }' 00:35:05.246 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:35:05.246 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:35:05.246 pt2 00:35:05.246 pt3 00:35:05.246 pt4' 00:35:05.246 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:05.246 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:35:05.246 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:05.504 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:05.504 "name": "pt1", 00:35:05.504 "aliases": [ 00:35:05.504 "00000000-0000-0000-0000-000000000001" 00:35:05.504 ], 00:35:05.504 "product_name": "passthru", 00:35:05.504 "block_size": 512, 00:35:05.504 "num_blocks": 65536, 00:35:05.504 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:05.504 "assigned_rate_limits": { 00:35:05.504 "rw_ios_per_sec": 0, 00:35:05.504 "rw_mbytes_per_sec": 0, 00:35:05.504 "r_mbytes_per_sec": 0, 00:35:05.504 "w_mbytes_per_sec": 0 00:35:05.504 }, 00:35:05.504 "claimed": true, 00:35:05.504 "claim_type": "exclusive_write", 
00:35:05.504 "zoned": false, 00:35:05.504 "supported_io_types": { 00:35:05.504 "read": true, 00:35:05.504 "write": true, 00:35:05.504 "unmap": true, 00:35:05.504 "write_zeroes": true, 00:35:05.504 "flush": true, 00:35:05.504 "reset": true, 00:35:05.504 "compare": false, 00:35:05.504 "compare_and_write": false, 00:35:05.504 "abort": true, 00:35:05.504 "nvme_admin": false, 00:35:05.504 "nvme_io": false 00:35:05.504 }, 00:35:05.504 "memory_domains": [ 00:35:05.504 { 00:35:05.504 "dma_device_id": "system", 00:35:05.504 "dma_device_type": 1 00:35:05.504 }, 00:35:05.504 { 00:35:05.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:05.504 "dma_device_type": 2 00:35:05.504 } 00:35:05.504 ], 00:35:05.504 "driver_specific": { 00:35:05.504 "passthru": { 00:35:05.504 "name": "pt1", 00:35:05.504 "base_bdev_name": "malloc1" 00:35:05.504 } 00:35:05.504 } 00:35:05.504 }' 00:35:05.504 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:05.504 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:05.504 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:05.504 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:05.763 11:45:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:35:05.763 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:06.021 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:06.021 "name": "pt2", 00:35:06.021 "aliases": [ 00:35:06.021 "00000000-0000-0000-0000-000000000002" 00:35:06.021 ], 00:35:06.021 "product_name": "passthru", 00:35:06.021 "block_size": 512, 00:35:06.021 "num_blocks": 65536, 00:35:06.021 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:06.021 "assigned_rate_limits": { 00:35:06.021 "rw_ios_per_sec": 0, 00:35:06.021 "rw_mbytes_per_sec": 0, 00:35:06.021 "r_mbytes_per_sec": 0, 00:35:06.021 "w_mbytes_per_sec": 0 00:35:06.021 }, 00:35:06.021 "claimed": true, 00:35:06.021 "claim_type": "exclusive_write", 00:35:06.021 "zoned": false, 00:35:06.021 "supported_io_types": { 00:35:06.021 "read": true, 00:35:06.021 "write": true, 00:35:06.021 "unmap": true, 00:35:06.021 "write_zeroes": true, 00:35:06.021 "flush": true, 00:35:06.021 "reset": true, 00:35:06.021 "compare": false, 00:35:06.021 "compare_and_write": false, 00:35:06.021 "abort": true, 00:35:06.021 "nvme_admin": false, 00:35:06.021 "nvme_io": false 00:35:06.021 }, 00:35:06.021 "memory_domains": [ 00:35:06.021 { 00:35:06.021 "dma_device_id": "system", 00:35:06.021 "dma_device_type": 1 00:35:06.021 }, 00:35:06.021 { 00:35:06.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:06.021 "dma_device_type": 2 00:35:06.021 } 00:35:06.021 ], 00:35:06.021 "driver_specific": { 00:35:06.021 "passthru": { 00:35:06.021 "name": "pt2", 00:35:06.021 "base_bdev_name": "malloc2" 00:35:06.021 } 00:35:06.021 } 
00:35:06.021 }' 00:35:06.021 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:06.021 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:06.021 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:06.021 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:06.279 11:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:35:06.280 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:06.538 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:06.538 "name": "pt3", 00:35:06.538 "aliases": [ 00:35:06.538 "00000000-0000-0000-0000-000000000003" 00:35:06.538 ], 00:35:06.538 "product_name": "passthru", 00:35:06.538 "block_size": 512, 00:35:06.538 "num_blocks": 65536, 00:35:06.538 "uuid": "00000000-0000-0000-0000-000000000003", 
00:35:06.538 "assigned_rate_limits": { 00:35:06.538 "rw_ios_per_sec": 0, 00:35:06.538 "rw_mbytes_per_sec": 0, 00:35:06.538 "r_mbytes_per_sec": 0, 00:35:06.538 "w_mbytes_per_sec": 0 00:35:06.538 }, 00:35:06.538 "claimed": true, 00:35:06.538 "claim_type": "exclusive_write", 00:35:06.538 "zoned": false, 00:35:06.538 "supported_io_types": { 00:35:06.538 "read": true, 00:35:06.538 "write": true, 00:35:06.538 "unmap": true, 00:35:06.538 "write_zeroes": true, 00:35:06.538 "flush": true, 00:35:06.538 "reset": true, 00:35:06.538 "compare": false, 00:35:06.538 "compare_and_write": false, 00:35:06.538 "abort": true, 00:35:06.538 "nvme_admin": false, 00:35:06.538 "nvme_io": false 00:35:06.538 }, 00:35:06.538 "memory_domains": [ 00:35:06.538 { 00:35:06.538 "dma_device_id": "system", 00:35:06.538 "dma_device_type": 1 00:35:06.538 }, 00:35:06.538 { 00:35:06.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:06.538 "dma_device_type": 2 00:35:06.538 } 00:35:06.538 ], 00:35:06.538 "driver_specific": { 00:35:06.538 "passthru": { 00:35:06.538 "name": "pt3", 00:35:06.538 "base_bdev_name": "malloc3" 00:35:06.538 } 00:35:06.538 } 00:35:06.538 }' 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:06.539 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:35:06.797 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:07.056 "name": "pt4", 00:35:07.056 "aliases": [ 00:35:07.056 "00000000-0000-0000-0000-000000000004" 00:35:07.056 ], 00:35:07.056 "product_name": "passthru", 00:35:07.056 "block_size": 512, 00:35:07.056 "num_blocks": 65536, 00:35:07.056 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:07.056 "assigned_rate_limits": { 00:35:07.056 "rw_ios_per_sec": 0, 00:35:07.056 "rw_mbytes_per_sec": 0, 00:35:07.056 "r_mbytes_per_sec": 0, 00:35:07.056 "w_mbytes_per_sec": 0 00:35:07.056 }, 00:35:07.056 "claimed": true, 00:35:07.056 "claim_type": "exclusive_write", 00:35:07.056 "zoned": false, 00:35:07.056 "supported_io_types": { 00:35:07.056 "read": true, 00:35:07.056 "write": true, 00:35:07.056 "unmap": true, 00:35:07.056 "write_zeroes": true, 00:35:07.056 "flush": true, 00:35:07.056 "reset": true, 00:35:07.056 "compare": false, 00:35:07.056 "compare_and_write": false, 00:35:07.056 "abort": true, 00:35:07.056 "nvme_admin": false, 00:35:07.056 "nvme_io": false 00:35:07.056 }, 00:35:07.056 "memory_domains": [ 00:35:07.056 { 00:35:07.056 "dma_device_id": "system", 00:35:07.056 "dma_device_type": 1 00:35:07.056 }, 00:35:07.056 { 00:35:07.056 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:07.056 "dma_device_type": 2 00:35:07.056 } 00:35:07.056 ], 00:35:07.056 "driver_specific": { 00:35:07.056 "passthru": { 00:35:07.056 "name": "pt4", 00:35:07.056 "base_bdev_name": "malloc4" 00:35:07.056 } 00:35:07.056 } 00:35:07.056 }' 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:07.056 11:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:35:07.314 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:35:07.574 [2024-06-10 11:45:51.289067] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:07.574 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # 
raid_bdev_uuid=9a2b8166-f437-4fb5-ac08-4ed3cc30cc41 00:35:07.574 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9a2b8166-f437-4fb5-ac08-4ed3cc30cc41 ']' 00:35:07.574 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:35:07.574 [2024-06-10 11:45:51.465324] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:35:07.574 [2024-06-10 11:45:51.465341] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:07.574 [2024-06-10 11:45:51.465376] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:07.574 [2024-06-10 11:45:51.465418] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:07.574 [2024-06-10 11:45:51.465425] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e431a0 name raid_bdev1, state offline 00:35:07.574 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:07.574 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:35:07.833 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:35:07.833 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:35:07.833 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:35:07.833 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:35:08.092 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:35:08.092 11:45:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:35:08.092 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:35:08.092 11:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:35:08.354 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:35:08.354 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:35:08.616 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:35:08.874 [2024-06-10 11:45:52.684447] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:35:08.874 [2024-06-10 11:45:52.685458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:35:08.874 [2024-06-10 11:45:52.685491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:35:08.874 [2024-06-10 11:45:52.685514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:35:08.874 [2024-06-10 11:45:52.685547] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 
00:35:08.874 [2024-06-10 11:45:52.685577] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:35:08.874 [2024-06-10 11:45:52.685592] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:35:08.874 [2024-06-10 11:45:52.685607] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:35:08.874 [2024-06-10 11:45:52.685618] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:35:08.874 [2024-06-10 11:45:52.685626] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e41ad0 name raid_bdev1, state configuring 00:35:08.874 request: 00:35:08.874 { 00:35:08.874 "name": "raid_bdev1", 00:35:08.874 "raid_level": "concat", 00:35:08.874 "base_bdevs": [ 00:35:08.874 "malloc1", 00:35:08.874 "malloc2", 00:35:08.874 "malloc3", 00:35:08.874 "malloc4" 00:35:08.874 ], 00:35:08.875 "superblock": false, 00:35:08.875 "strip_size_kb": 64, 00:35:08.875 "method": "bdev_raid_create", 00:35:08.875 "req_id": 1 00:35:08.875 } 00:35:08.875 Got JSON-RPC error response 00:35:08.875 response: 00:35:08.875 { 00:35:08.875 "code": -17, 00:35:08.875 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:35:08.875 } 00:35:08.875 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:35:08.875 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:35:08.875 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:35:08.875 11:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:35:08.875 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:08.875 11:45:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:35:09.133 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:35:09.133 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:35:09.133 11:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:35:09.133 [2024-06-10 11:45:53.033321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:35:09.133 [2024-06-10 11:45:53.033357] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:09.133 [2024-06-10 11:45:53.033370] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e464a0 00:35:09.133 [2024-06-10 11:45:53.033378] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:09.133 [2024-06-10 11:45:53.034609] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:09.133 [2024-06-10 11:45:53.034633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:35:09.133 [2024-06-10 11:45:53.034685] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:35:09.133 [2024-06-10 11:45:53.034720] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:35:09.133 pt1 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:09.133 11:45:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:09.133 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:09.393 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:09.393 "name": "raid_bdev1", 00:35:09.393 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:09.393 "strip_size_kb": 64, 00:35:09.393 "state": "configuring", 00:35:09.393 "raid_level": "concat", 00:35:09.393 "superblock": true, 00:35:09.393 "num_base_bdevs": 4, 00:35:09.393 "num_base_bdevs_discovered": 1, 00:35:09.393 "num_base_bdevs_operational": 4, 00:35:09.393 "base_bdevs_list": [ 00:35:09.393 { 00:35:09.393 "name": "pt1", 00:35:09.393 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:09.393 "is_configured": true, 00:35:09.393 "data_offset": 2048, 00:35:09.393 "data_size": 63488 00:35:09.393 }, 00:35:09.393 { 00:35:09.393 "name": null, 00:35:09.393 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:09.393 "is_configured": false, 00:35:09.393 "data_offset": 2048, 00:35:09.393 "data_size": 63488 00:35:09.393 }, 00:35:09.393 { 00:35:09.393 "name": null, 00:35:09.393 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:09.393 
"is_configured": false, 00:35:09.393 "data_offset": 2048, 00:35:09.393 "data_size": 63488 00:35:09.393 }, 00:35:09.393 { 00:35:09.393 "name": null, 00:35:09.393 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:09.393 "is_configured": false, 00:35:09.393 "data_offset": 2048, 00:35:09.393 "data_size": 63488 00:35:09.393 } 00:35:09.393 ] 00:35:09.393 }' 00:35:09.393 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:09.393 11:45:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:35:09.961 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:35:09.961 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:35:09.961 [2024-06-10 11:45:53.863472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:35:09.961 [2024-06-10 11:45:53.863510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:09.961 [2024-06-10 11:45:53.863524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e45a60 00:35:09.961 [2024-06-10 11:45:53.863532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:09.961 [2024-06-10 11:45:53.863774] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:09.961 [2024-06-10 11:45:53.863788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:35:09.961 [2024-06-10 11:45:53.863834] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:35:09.961 [2024-06-10 11:45:53.863848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:35:09.961 pt2 00:35:09.961 11:45:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:35:10.221 [2024-06-10 11:45:54.035932] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:10.221 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:10.480 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:10.480 "name": "raid_bdev1", 00:35:10.480 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:10.480 "strip_size_kb": 64, 00:35:10.480 "state": "configuring", 00:35:10.480 "raid_level": "concat", 00:35:10.480 "superblock": true, 
00:35:10.480 "num_base_bdevs": 4, 00:35:10.480 "num_base_bdevs_discovered": 1, 00:35:10.480 "num_base_bdevs_operational": 4, 00:35:10.480 "base_bdevs_list": [ 00:35:10.480 { 00:35:10.480 "name": "pt1", 00:35:10.480 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:10.480 "is_configured": true, 00:35:10.480 "data_offset": 2048, 00:35:10.480 "data_size": 63488 00:35:10.480 }, 00:35:10.480 { 00:35:10.480 "name": null, 00:35:10.480 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:10.480 "is_configured": false, 00:35:10.480 "data_offset": 2048, 00:35:10.480 "data_size": 63488 00:35:10.480 }, 00:35:10.480 { 00:35:10.480 "name": null, 00:35:10.480 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:10.480 "is_configured": false, 00:35:10.480 "data_offset": 2048, 00:35:10.480 "data_size": 63488 00:35:10.480 }, 00:35:10.480 { 00:35:10.480 "name": null, 00:35:10.480 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:10.480 "is_configured": false, 00:35:10.480 "data_offset": 2048, 00:35:10.480 "data_size": 63488 00:35:10.480 } 00:35:10.480 ] 00:35:10.480 }' 00:35:10.480 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:10.480 11:45:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:35:11.049 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:35:11.049 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:35:11.049 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:35:11.049 [2024-06-10 11:45:54.866050] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:35:11.049 [2024-06-10 11:45:54.866085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:11.049 [2024-06-10 
11:45:54.866097] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e44880 00:35:11.049 [2024-06-10 11:45:54.866104] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:11.049 [2024-06-10 11:45:54.866335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:11.049 [2024-06-10 11:45:54.866347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:35:11.049 [2024-06-10 11:45:54.866390] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:35:11.049 [2024-06-10 11:45:54.866404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:35:11.049 pt2 00:35:11.049 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:35:11.049 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:35:11.049 11:45:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:35:11.308 [2024-06-10 11:45:55.042507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:35:11.308 [2024-06-10 11:45:55.042533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:11.308 [2024-06-10 11:45:55.042544] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c93a00 00:35:11.308 [2024-06-10 11:45:55.042551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:11.308 [2024-06-10 11:45:55.042737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:11.308 [2024-06-10 11:45:55.042748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:35:11.308 [2024-06-10 11:45:55.042786] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt3 00:35:11.308 [2024-06-10 11:45:55.042796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:35:11.308 pt3 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:35:11.308 [2024-06-10 11:45:55.214948] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:35:11.308 [2024-06-10 11:45:55.214973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:11.308 [2024-06-10 11:45:55.214987] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e45060 00:35:11.308 [2024-06-10 11:45:55.214996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:11.308 [2024-06-10 11:45:55.215191] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:11.308 [2024-06-10 11:45:55.215203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:35:11.308 [2024-06-10 11:45:55.215236] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:35:11.308 [2024-06-10 11:45:55.215247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:35:11.308 [2024-06-10 11:45:55.215328] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e45db0 00:35:11.308 [2024-06-10 11:45:55.215336] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:35:11.308 [2024-06-10 11:45:55.215447] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e29f90 00:35:11.308 [2024-06-10 11:45:55.215536] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e45db0 00:35:11.308 [2024-06-10 11:45:55.215543] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e45db0 00:35:11.308 [2024-06-10 11:45:55.215610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:11.308 pt4 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:11.308 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:35:11.568 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:11.568 "name": "raid_bdev1", 00:35:11.568 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:11.568 "strip_size_kb": 64, 00:35:11.568 "state": "online", 00:35:11.568 "raid_level": "concat", 00:35:11.568 "superblock": true, 00:35:11.568 "num_base_bdevs": 4, 00:35:11.568 "num_base_bdevs_discovered": 4, 00:35:11.568 "num_base_bdevs_operational": 4, 00:35:11.568 "base_bdevs_list": [ 00:35:11.568 { 00:35:11.568 "name": "pt1", 00:35:11.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:11.568 "is_configured": true, 00:35:11.568 "data_offset": 2048, 00:35:11.568 "data_size": 63488 00:35:11.568 }, 00:35:11.568 { 00:35:11.568 "name": "pt2", 00:35:11.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:11.568 "is_configured": true, 00:35:11.568 "data_offset": 2048, 00:35:11.568 "data_size": 63488 00:35:11.568 }, 00:35:11.568 { 00:35:11.568 "name": "pt3", 00:35:11.568 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:11.568 "is_configured": true, 00:35:11.568 "data_offset": 2048, 00:35:11.568 "data_size": 63488 00:35:11.568 }, 00:35:11.568 { 00:35:11.568 "name": "pt4", 00:35:11.568 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:11.568 "is_configured": true, 00:35:11.568 "data_offset": 2048, 00:35:11.568 "data_size": 63488 00:35:11.568 } 00:35:11.568 ] 00:35:11.568 }' 00:35:11.568 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:11.568 11:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:35:12.136 11:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:35:12.136 [2024-06-10 11:45:56.069331] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:35:12.396 "name": "raid_bdev1", 00:35:12.396 "aliases": [ 00:35:12.396 "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41" 00:35:12.396 ], 00:35:12.396 "product_name": "Raid Volume", 00:35:12.396 "block_size": 512, 00:35:12.396 "num_blocks": 253952, 00:35:12.396 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:12.396 "assigned_rate_limits": { 00:35:12.396 "rw_ios_per_sec": 0, 00:35:12.396 "rw_mbytes_per_sec": 0, 00:35:12.396 "r_mbytes_per_sec": 0, 00:35:12.396 "w_mbytes_per_sec": 0 00:35:12.396 }, 00:35:12.396 "claimed": false, 00:35:12.396 "zoned": false, 00:35:12.396 "supported_io_types": { 00:35:12.396 "read": true, 00:35:12.396 "write": true, 00:35:12.396 "unmap": true, 00:35:12.396 "write_zeroes": true, 00:35:12.396 "flush": true, 00:35:12.396 "reset": true, 00:35:12.396 "compare": false, 00:35:12.396 "compare_and_write": false, 00:35:12.396 "abort": false, 00:35:12.396 "nvme_admin": false, 00:35:12.396 "nvme_io": false 00:35:12.396 }, 00:35:12.396 "memory_domains": [ 00:35:12.396 { 00:35:12.396 "dma_device_id": "system", 00:35:12.396 "dma_device_type": 1 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:12.396 "dma_device_type": 2 00:35:12.396 }, 00:35:12.396 { 
00:35:12.396 "dma_device_id": "system", 00:35:12.396 "dma_device_type": 1 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:12.396 "dma_device_type": 2 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "dma_device_id": "system", 00:35:12.396 "dma_device_type": 1 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:12.396 "dma_device_type": 2 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "dma_device_id": "system", 00:35:12.396 "dma_device_type": 1 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:12.396 "dma_device_type": 2 00:35:12.396 } 00:35:12.396 ], 00:35:12.396 "driver_specific": { 00:35:12.396 "raid": { 00:35:12.396 "uuid": "9a2b8166-f437-4fb5-ac08-4ed3cc30cc41", 00:35:12.396 "strip_size_kb": 64, 00:35:12.396 "state": "online", 00:35:12.396 "raid_level": "concat", 00:35:12.396 "superblock": true, 00:35:12.396 "num_base_bdevs": 4, 00:35:12.396 "num_base_bdevs_discovered": 4, 00:35:12.396 "num_base_bdevs_operational": 4, 00:35:12.396 "base_bdevs_list": [ 00:35:12.396 { 00:35:12.396 "name": "pt1", 00:35:12.396 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:12.396 "is_configured": true, 00:35:12.396 "data_offset": 2048, 00:35:12.396 "data_size": 63488 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "name": "pt2", 00:35:12.396 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:12.396 "is_configured": true, 00:35:12.396 "data_offset": 2048, 00:35:12.396 "data_size": 63488 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "name": "pt3", 00:35:12.396 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:12.396 "is_configured": true, 00:35:12.396 "data_offset": 2048, 00:35:12.396 "data_size": 63488 00:35:12.396 }, 00:35:12.396 { 00:35:12.396 "name": "pt4", 00:35:12.396 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:12.396 "is_configured": true, 00:35:12.396 "data_offset": 2048, 00:35:12.396 "data_size": 63488 00:35:12.396 } 00:35:12.396 ] 
00:35:12.396 } 00:35:12.396 } 00:35:12.396 }' 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:35:12.396 pt2 00:35:12.396 pt3 00:35:12.396 pt4' 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:12.396 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:12.396 "name": "pt1", 00:35:12.396 "aliases": [ 00:35:12.396 "00000000-0000-0000-0000-000000000001" 00:35:12.396 ], 00:35:12.396 "product_name": "passthru", 00:35:12.396 "block_size": 512, 00:35:12.396 "num_blocks": 65536, 00:35:12.396 "uuid": "00000000-0000-0000-0000-000000000001", 00:35:12.396 "assigned_rate_limits": { 00:35:12.396 "rw_ios_per_sec": 0, 00:35:12.396 "rw_mbytes_per_sec": 0, 00:35:12.396 "r_mbytes_per_sec": 0, 00:35:12.396 "w_mbytes_per_sec": 0 00:35:12.396 }, 00:35:12.397 "claimed": true, 00:35:12.397 "claim_type": "exclusive_write", 00:35:12.397 "zoned": false, 00:35:12.397 "supported_io_types": { 00:35:12.397 "read": true, 00:35:12.397 "write": true, 00:35:12.397 "unmap": true, 00:35:12.397 "write_zeroes": true, 00:35:12.397 "flush": true, 00:35:12.397 "reset": true, 00:35:12.397 "compare": false, 00:35:12.397 "compare_and_write": false, 00:35:12.397 "abort": true, 00:35:12.397 "nvme_admin": false, 00:35:12.397 "nvme_io": false 00:35:12.397 }, 00:35:12.397 "memory_domains": [ 00:35:12.397 { 00:35:12.397 "dma_device_id": "system", 00:35:12.397 "dma_device_type": 1 00:35:12.397 }, 
00:35:12.397 { 00:35:12.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:12.397 "dma_device_type": 2 00:35:12.397 } 00:35:12.397 ], 00:35:12.397 "driver_specific": { 00:35:12.397 "passthru": { 00:35:12.397 "name": "pt1", 00:35:12.397 "base_bdev_name": "malloc1" 00:35:12.397 } 00:35:12.397 } 00:35:12.397 }' 00:35:12.397 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:12.656 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:35:12.915 "name": "pt2", 00:35:12.915 "aliases": [ 00:35:12.915 "00000000-0000-0000-0000-000000000002" 00:35:12.915 ], 00:35:12.915 "product_name": "passthru", 00:35:12.915 "block_size": 512, 00:35:12.915 "num_blocks": 65536, 00:35:12.915 "uuid": "00000000-0000-0000-0000-000000000002", 00:35:12.915 "assigned_rate_limits": { 00:35:12.915 "rw_ios_per_sec": 0, 00:35:12.915 "rw_mbytes_per_sec": 0, 00:35:12.915 "r_mbytes_per_sec": 0, 00:35:12.915 "w_mbytes_per_sec": 0 00:35:12.915 }, 00:35:12.915 "claimed": true, 00:35:12.915 "claim_type": "exclusive_write", 00:35:12.915 "zoned": false, 00:35:12.915 "supported_io_types": { 00:35:12.915 "read": true, 00:35:12.915 "write": true, 00:35:12.915 "unmap": true, 00:35:12.915 "write_zeroes": true, 00:35:12.915 "flush": true, 00:35:12.915 "reset": true, 00:35:12.915 "compare": false, 00:35:12.915 "compare_and_write": false, 00:35:12.915 "abort": true, 00:35:12.915 "nvme_admin": false, 00:35:12.915 "nvme_io": false 00:35:12.915 }, 00:35:12.915 "memory_domains": [ 00:35:12.915 { 00:35:12.915 "dma_device_id": "system", 00:35:12.915 "dma_device_type": 1 00:35:12.915 }, 00:35:12.915 { 00:35:12.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:12.915 "dma_device_type": 2 00:35:12.915 } 00:35:12.915 ], 00:35:12.915 "driver_specific": { 00:35:12.915 "passthru": { 00:35:12.915 "name": "pt2", 00:35:12.915 "base_bdev_name": "malloc2" 00:35:12.915 } 00:35:12.915 } 00:35:12.915 }' 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:12.915 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:13.174 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:13.174 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:13.174 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:13.174 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# [[ null == null ]] 00:35:13.174 11:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:13.174 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:13.174 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:13.174 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:13.174 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:13.434 "name": "pt3", 00:35:13.434 "aliases": [ 00:35:13.434 "00000000-0000-0000-0000-000000000003" 00:35:13.434 ], 00:35:13.434 "product_name": "passthru", 00:35:13.434 "block_size": 512, 00:35:13.434 "num_blocks": 65536, 00:35:13.434 "uuid": "00000000-0000-0000-0000-000000000003", 00:35:13.434 "assigned_rate_limits": { 00:35:13.434 "rw_ios_per_sec": 0, 00:35:13.434 "rw_mbytes_per_sec": 0, 00:35:13.434 "r_mbytes_per_sec": 0, 00:35:13.434 "w_mbytes_per_sec": 0 00:35:13.434 }, 00:35:13.434 "claimed": true, 00:35:13.434 "claim_type": "exclusive_write", 00:35:13.434 "zoned": false, 00:35:13.434 "supported_io_types": { 00:35:13.434 "read": true, 00:35:13.434 "write": true, 00:35:13.434 "unmap": true, 00:35:13.434 "write_zeroes": true, 00:35:13.434 "flush": true, 00:35:13.434 "reset": true, 00:35:13.434 "compare": false, 00:35:13.434 "compare_and_write": false, 
00:35:13.434 "abort": true, 00:35:13.434 "nvme_admin": false, 00:35:13.434 "nvme_io": false 00:35:13.434 }, 00:35:13.434 "memory_domains": [ 00:35:13.434 { 00:35:13.434 "dma_device_id": "system", 00:35:13.434 "dma_device_type": 1 00:35:13.434 }, 00:35:13.434 { 00:35:13.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:13.434 "dma_device_type": 2 00:35:13.434 } 00:35:13.434 ], 00:35:13.434 "driver_specific": { 00:35:13.434 "passthru": { 00:35:13.434 "name": "pt3", 00:35:13.434 "base_bdev_name": "malloc3" 00:35:13.434 } 00:35:13.434 } 00:35:13.434 }' 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:13.434 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:35:13.693 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:13.952 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:13.952 "name": "pt4", 00:35:13.952 "aliases": [ 00:35:13.952 "00000000-0000-0000-0000-000000000004" 00:35:13.952 ], 00:35:13.952 "product_name": "passthru", 00:35:13.952 "block_size": 512, 00:35:13.952 "num_blocks": 65536, 00:35:13.952 "uuid": "00000000-0000-0000-0000-000000000004", 00:35:13.952 "assigned_rate_limits": { 00:35:13.952 "rw_ios_per_sec": 0, 00:35:13.952 "rw_mbytes_per_sec": 0, 00:35:13.952 "r_mbytes_per_sec": 0, 00:35:13.952 "w_mbytes_per_sec": 0 00:35:13.952 }, 00:35:13.952 "claimed": true, 00:35:13.952 "claim_type": "exclusive_write", 00:35:13.952 "zoned": false, 00:35:13.952 "supported_io_types": { 00:35:13.952 "read": true, 00:35:13.952 "write": true, 00:35:13.952 "unmap": true, 00:35:13.952 "write_zeroes": true, 00:35:13.952 "flush": true, 00:35:13.952 "reset": true, 00:35:13.952 "compare": false, 00:35:13.952 "compare_and_write": false, 00:35:13.952 "abort": true, 00:35:13.952 "nvme_admin": false, 00:35:13.952 "nvme_io": false 00:35:13.952 }, 00:35:13.952 "memory_domains": [ 00:35:13.952 { 00:35:13.952 "dma_device_id": "system", 00:35:13.952 "dma_device_type": 1 00:35:13.952 }, 00:35:13.952 { 00:35:13.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:13.952 "dma_device_type": 2 00:35:13.952 } 00:35:13.952 ], 00:35:13.952 "driver_specific": { 00:35:13.952 "passthru": { 00:35:13.952 "name": "pt4", 00:35:13.952 "base_bdev_name": "malloc4" 00:35:13.952 } 00:35:13.952 } 00:35:13.952 }' 00:35:13.952 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:13.952 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:13.952 11:45:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:13.952 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:14.212 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:14.212 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:14.212 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:14.212 11:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:14.212 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:14.212 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:14.212 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:14.212 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:14.212 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:35:14.212 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:35:14.472 [2024-06-10 11:45:58.251019] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9a2b8166-f437-4fb5-ac08-4ed3cc30cc41 '!=' 9a2b8166-f437-4fb5-ac08-4ed3cc30cc41 ']' 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 199690 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 199690 ']' 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 199690 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 199690 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 199690' 00:35:14.472 killing process with pid 199690 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 199690 00:35:14.472 [2024-06-10 11:45:58.319110] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:14.472 [2024-06-10 11:45:58.319159] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:14.472 [2024-06-10 11:45:58.319205] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:14.472 [2024-06-10 11:45:58.319214] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e45db0 name raid_bdev1, state offline 00:35:14.472 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 199690 00:35:14.472 [2024-06-10 11:45:58.360702] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:14.732 11:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:35:14.732 00:35:14.732 real 0m12.691s 00:35:14.732 user 0m22.677s 00:35:14.732 sys 0m2.483s 00:35:14.732 11:45:58 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:35:14.732 11:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:35:14.732 ************************************ 00:35:14.732 END TEST raid_superblock_test 00:35:14.732 ************************************ 00:35:14.732 11:45:58 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:35:14.732 11:45:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:35:14.732 11:45:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:14.732 11:45:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:14.732 ************************************ 00:35:14.732 START TEST raid_read_error_test 00:35:14.732 ************************************ 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 read 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:14.732 11:45:58 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RhnSM1GOlt 00:35:14.732 
11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=201765 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 201765 /var/tmp/spdk-raid.sock 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 201765 ']' 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:14.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:14.732 11:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:14.992 [2024-06-10 11:45:58.717692] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:35:14.992 [2024-06-10 11:45:58.717744] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201765 ] 00:35:14.992 [2024-06-10 11:45:58.804326] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:14.992 [2024-06-10 11:45:58.889256] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:15.251 [2024-06-10 11:45:58.946813] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:15.251 [2024-06-10 11:45:58.946840] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:15.819 11:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:15.819 11:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:35:15.819 11:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:15.819 11:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:35:15.819 BaseBdev1_malloc 00:35:15.819 11:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:35:16.079 true 00:35:16.079 11:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:35:16.360 [2024-06-10 11:46:00.043216] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:35:16.360 [2024-06-10 11:46:00.043257] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:16.360 
[2024-06-10 11:46:00.043272] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b3eb10 00:35:16.360 [2024-06-10 11:46:00.043281] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:16.360 [2024-06-10 11:46:00.044597] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:16.360 [2024-06-10 11:46:00.044622] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:35:16.360 BaseBdev1 00:35:16.360 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:16.360 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:35:16.360 BaseBdev2_malloc 00:35:16.360 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:35:16.619 true 00:35:16.619 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:35:16.619 [2024-06-10 11:46:00.560245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:35:16.619 [2024-06-10 11:46:00.560283] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:16.619 [2024-06-10 11:46:00.560298] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b43280 00:35:16.619 [2024-06-10 11:46:00.560307] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:16.619 [2024-06-10 11:46:00.561343] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:16.619 [2024-06-10 11:46:00.561365] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:35:16.619 BaseBdev2 00:35:16.879 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:16.879 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:35:16.879 BaseBdev3_malloc 00:35:16.879 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:35:17.137 true 00:35:17.137 11:46:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:35:17.396 [2024-06-10 11:46:01.097343] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:35:17.396 [2024-06-10 11:46:01.097378] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:17.396 [2024-06-10 11:46:01.097394] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b45ab0 00:35:17.396 [2024-06-10 11:46:01.097403] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:17.396 [2024-06-10 11:46:01.098378] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:17.396 [2024-06-10 11:46:01.098401] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:35:17.396 BaseBdev3 00:35:17.396 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:17.396 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:35:17.396 BaseBdev4_malloc 00:35:17.396 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:35:17.681 true 00:35:17.681 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:35:17.681 [2024-06-10 11:46:01.626432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:35:17.681 [2024-06-10 11:46:01.626470] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:17.681 [2024-06-10 11:46:01.626485] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b46380 00:35:17.681 [2024-06-10 11:46:01.626493] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:17.940 [2024-06-10 11:46:01.627507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:17.940 [2024-06-10 11:46:01.627529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:35:17.940 BaseBdev4 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:35:17.940 [2024-06-10 11:46:01.802940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:17.940 [2024-06-10 11:46:01.803886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:17.940 [2024-06-10 11:46:01.803936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:35:17.940 [2024-06-10 11:46:01.803977] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:35:17.940 [2024-06-10 11:46:01.804142] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b3e020 00:35:17.940 [2024-06-10 11:46:01.804150] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:35:17.940 [2024-06-10 11:46:01.804293] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b40400 00:35:17.940 [2024-06-10 11:46:01.804402] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b3e020 00:35:17.940 [2024-06-10 11:46:01.804408] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b3e020 00:35:17.940 [2024-06-10 11:46:01.804479] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:17.940 11:46:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:18.199 11:46:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:18.199 "name": "raid_bdev1", 00:35:18.199 "uuid": "066561d1-d006-4ee4-9b8d-e6e6229b29fd", 00:35:18.199 "strip_size_kb": 64, 00:35:18.199 "state": "online", 00:35:18.199 "raid_level": "concat", 00:35:18.199 "superblock": true, 00:35:18.199 "num_base_bdevs": 4, 00:35:18.199 "num_base_bdevs_discovered": 4, 00:35:18.199 "num_base_bdevs_operational": 4, 00:35:18.199 "base_bdevs_list": [ 00:35:18.199 { 00:35:18.199 "name": "BaseBdev1", 00:35:18.199 "uuid": "8d3fd3e1-81b4-584f-8ef8-b292c65bfc08", 00:35:18.199 "is_configured": true, 00:35:18.199 "data_offset": 2048, 00:35:18.199 "data_size": 63488 00:35:18.199 }, 00:35:18.199 { 00:35:18.199 "name": "BaseBdev2", 00:35:18.199 "uuid": "f1a5a3cd-edde-5638-bd94-6da68edb2bd4", 00:35:18.199 "is_configured": true, 00:35:18.199 "data_offset": 2048, 00:35:18.199 "data_size": 63488 00:35:18.199 }, 00:35:18.199 { 00:35:18.199 "name": "BaseBdev3", 00:35:18.199 "uuid": "38975ac9-aa32-59ff-b3c4-d0c6e11af0a7", 00:35:18.199 "is_configured": true, 00:35:18.199 "data_offset": 2048, 00:35:18.199 "data_size": 63488 00:35:18.199 }, 00:35:18.199 { 00:35:18.199 "name": "BaseBdev4", 00:35:18.199 "uuid": "b28f0faa-8dc7-5a23-b8e8-10b144d7fecd", 00:35:18.199 "is_configured": true, 00:35:18.199 "data_offset": 2048, 00:35:18.199 "data_size": 63488 00:35:18.199 } 00:35:18.199 ] 00:35:18.199 }' 00:35:18.199 11:46:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:18.199 11:46:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:18.767 11:46:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:35:18.767 11:46:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:35:18.767 [2024-06-10 11:46:02.569102] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b40670 00:35:19.706 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:19.965 "name": "raid_bdev1", 00:35:19.965 "uuid": "066561d1-d006-4ee4-9b8d-e6e6229b29fd", 00:35:19.965 "strip_size_kb": 64, 00:35:19.965 "state": "online", 00:35:19.965 "raid_level": "concat", 00:35:19.965 "superblock": true, 00:35:19.965 "num_base_bdevs": 4, 00:35:19.965 "num_base_bdevs_discovered": 4, 00:35:19.965 "num_base_bdevs_operational": 4, 00:35:19.965 "base_bdevs_list": [ 00:35:19.965 { 00:35:19.965 "name": "BaseBdev1", 00:35:19.965 "uuid": "8d3fd3e1-81b4-584f-8ef8-b292c65bfc08", 00:35:19.965 "is_configured": true, 00:35:19.965 "data_offset": 2048, 00:35:19.965 "data_size": 63488 00:35:19.965 }, 00:35:19.965 { 00:35:19.965 "name": "BaseBdev2", 00:35:19.965 "uuid": "f1a5a3cd-edde-5638-bd94-6da68edb2bd4", 00:35:19.965 "is_configured": true, 00:35:19.965 "data_offset": 2048, 00:35:19.965 "data_size": 63488 00:35:19.965 }, 00:35:19.965 { 00:35:19.965 "name": "BaseBdev3", 00:35:19.965 "uuid": "38975ac9-aa32-59ff-b3c4-d0c6e11af0a7", 00:35:19.965 "is_configured": true, 00:35:19.965 "data_offset": 2048, 00:35:19.965 "data_size": 63488 00:35:19.965 }, 00:35:19.965 { 00:35:19.965 "name": "BaseBdev4", 00:35:19.965 "uuid": "b28f0faa-8dc7-5a23-b8e8-10b144d7fecd", 00:35:19.965 "is_configured": true, 00:35:19.965 "data_offset": 2048, 00:35:19.965 "data_size": 63488 00:35:19.965 } 00:35:19.965 ] 00:35:19.965 }' 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:19.965 11:46:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:20.534 11:46:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:35:20.794 [2024-06-10 11:46:04.522729] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:35:20.794 [2024-06-10 11:46:04.522763] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:20.794 [2024-06-10 11:46:04.524763] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:20.794 [2024-06-10 11:46:04.524793] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:20.794 [2024-06-10 11:46:04.524820] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:20.794 [2024-06-10 11:46:04.524827] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b3e020 name raid_bdev1, state offline 00:35:20.794 0 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 201765 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 201765 ']' 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 201765 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 201765 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 201765' 00:35:20.794 
killing process with pid 201765 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 201765 00:35:20.794 [2024-06-10 11:46:04.590310] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:20.794 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 201765 00:35:20.794 [2024-06-10 11:46:04.620672] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RhnSM1GOlt 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:35:21.053 00:35:21.053 real 0m6.187s 00:35:21.053 user 0m9.572s 00:35:21.053 sys 0m1.096s 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:21.053 11:46:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:21.053 ************************************ 00:35:21.053 END TEST raid_read_error_test 00:35:21.053 ************************************ 00:35:21.053 11:46:04 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:35:21.053 11:46:04 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:35:21.053 11:46:04 bdev_raid -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:35:21.053 11:46:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:21.053 ************************************ 00:35:21.053 START TEST raid_write_error_test 00:35:21.053 ************************************ 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 write 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 
00:35:21.053 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.fmckjEamH9 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=202591 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 202591 /var/tmp/spdk-raid.sock 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 
202591 ']' 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:21.054 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:21.054 11:46:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:21.054 [2024-06-10 11:46:04.991997] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:35:21.054 [2024-06-10 11:46:04.992055] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202591 ] 00:35:21.313 [2024-06-10 11:46:05.080823] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:21.313 [2024-06-10 11:46:05.167859] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:21.313 [2024-06-10 11:46:05.232726] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:21.313 [2024-06-10 11:46:05.232753] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:21.880 11:46:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:21.880 11:46:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:35:21.880 11:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:21.880 11:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:35:22.139 BaseBdev1_malloc 00:35:22.139 11:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:35:22.398 true 00:35:22.398 11:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:35:22.398 [2024-06-10 11:46:06.320752] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:35:22.398 [2024-06-10 11:46:06.320791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:22.398 [2024-06-10 11:46:06.320804] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2835b10 00:35:22.398 [2024-06-10 11:46:06.320812] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:22.398 [2024-06-10 11:46:06.321976] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:22.398 [2024-06-10 11:46:06.321999] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:35:22.398 BaseBdev1 00:35:22.398 11:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:22.398 11:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:35:22.657 BaseBdev2_malloc 00:35:22.657 11:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:35:22.916 true 00:35:22.916 11:46:06 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:35:22.916 [2024-06-10 11:46:06.853812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:35:22.916 [2024-06-10 11:46:06.853846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:22.916 [2024-06-10 11:46:06.853859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x283a280 00:35:22.916 [2024-06-10 11:46:06.853870] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:22.916 [2024-06-10 11:46:06.854783] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:22.916 [2024-06-10 11:46:06.854804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:35:22.916 BaseBdev2 00:35:23.175 11:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:23.175 11:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:35:23.175 BaseBdev3_malloc 00:35:23.175 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:35:23.434 true 00:35:23.434 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:35:23.434 [2024-06-10 11:46:07.358706] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:35:23.434 [2024-06-10 11:46:07.358743] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:23.434 [2024-06-10 11:46:07.358756] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x283cab0 00:35:23.434 [2024-06-10 11:46:07.358765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:23.434 [2024-06-10 11:46:07.359729] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:23.434 [2024-06-10 11:46:07.359750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:35:23.434 BaseBdev3 00:35:23.434 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:35:23.434 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:35:23.738 BaseBdev4_malloc 00:35:23.738 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:35:24.016 true 00:35:24.016 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:35:24.016 [2024-06-10 11:46:07.891709] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:35:24.016 [2024-06-10 11:46:07.891744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:24.016 [2024-06-10 11:46:07.891757] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x283d380 00:35:24.016 [2024-06-10 11:46:07.891764] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:24.016 [2024-06-10 11:46:07.892729] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:35:24.016 [2024-06-10 11:46:07.892752] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:35:24.016 BaseBdev4 00:35:24.017 11:46:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:35:24.275 [2024-06-10 11:46:08.068362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:24.275 [2024-06-10 11:46:08.069159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:24.275 [2024-06-10 11:46:08.069202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:35:24.275 [2024-06-10 11:46:08.069240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:35:24.275 [2024-06-10 11:46:08.069392] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2835020 00:35:24.275 [2024-06-10 11:46:08.069399] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:35:24.275 [2024-06-10 11:46:08.069520] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2837400 00:35:24.275 [2024-06-10 11:46:08.069616] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2835020 00:35:24.275 [2024-06-10 11:46:08.069621] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2835020 00:35:24.275 [2024-06-10 11:46:08.069683] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:24.275 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:24.533 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:24.533 "name": "raid_bdev1", 00:35:24.533 "uuid": "5cdbe615-f761-4593-a45f-a15bd9be84f6", 00:35:24.533 "strip_size_kb": 64, 00:35:24.533 "state": "online", 00:35:24.533 "raid_level": "concat", 00:35:24.533 "superblock": true, 00:35:24.533 "num_base_bdevs": 4, 00:35:24.533 "num_base_bdevs_discovered": 4, 00:35:24.533 "num_base_bdevs_operational": 4, 00:35:24.533 "base_bdevs_list": [ 00:35:24.533 { 00:35:24.533 "name": "BaseBdev1", 00:35:24.533 "uuid": "d6e3e65f-ed58-5015-81de-427051f5beb1", 00:35:24.533 "is_configured": true, 00:35:24.533 "data_offset": 2048, 00:35:24.533 "data_size": 63488 00:35:24.533 }, 00:35:24.533 { 00:35:24.533 "name": "BaseBdev2", 00:35:24.533 "uuid": "538b021c-6e0b-5d63-beb4-05a1ff1233a4", 00:35:24.533 "is_configured": true, 
00:35:24.533 "data_offset": 2048, 00:35:24.533 "data_size": 63488 00:35:24.533 }, 00:35:24.533 { 00:35:24.533 "name": "BaseBdev3", 00:35:24.533 "uuid": "5e9d1869-b3f2-581f-b8b4-007ff5690951", 00:35:24.533 "is_configured": true, 00:35:24.533 "data_offset": 2048, 00:35:24.533 "data_size": 63488 00:35:24.533 }, 00:35:24.533 { 00:35:24.533 "name": "BaseBdev4", 00:35:24.533 "uuid": "11730241-5272-5217-94a3-5cf7f2b4453e", 00:35:24.533 "is_configured": true, 00:35:24.533 "data_offset": 2048, 00:35:24.533 "data_size": 63488 00:35:24.533 } 00:35:24.533 ] 00:35:24.533 }' 00:35:24.533 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:24.533 11:46:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:25.099 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:35:25.099 11:46:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:35:25.099 [2024-06-10 11:46:08.834541] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2837670 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:26.035 11:46:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:26.293 11:46:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:26.293 "name": "raid_bdev1", 00:35:26.293 "uuid": "5cdbe615-f761-4593-a45f-a15bd9be84f6", 00:35:26.293 "strip_size_kb": 64, 00:35:26.293 "state": "online", 00:35:26.293 "raid_level": "concat", 00:35:26.293 "superblock": true, 00:35:26.293 "num_base_bdevs": 4, 00:35:26.293 "num_base_bdevs_discovered": 4, 00:35:26.293 "num_base_bdevs_operational": 4, 00:35:26.293 "base_bdevs_list": [ 00:35:26.293 { 00:35:26.293 "name": "BaseBdev1", 00:35:26.293 "uuid": "d6e3e65f-ed58-5015-81de-427051f5beb1", 00:35:26.293 "is_configured": true, 00:35:26.293 "data_offset": 2048, 00:35:26.293 "data_size": 63488 00:35:26.293 }, 00:35:26.293 { 00:35:26.293 "name": "BaseBdev2", 00:35:26.293 
"uuid": "538b021c-6e0b-5d63-beb4-05a1ff1233a4", 00:35:26.293 "is_configured": true, 00:35:26.293 "data_offset": 2048, 00:35:26.293 "data_size": 63488 00:35:26.293 }, 00:35:26.293 { 00:35:26.293 "name": "BaseBdev3", 00:35:26.293 "uuid": "5e9d1869-b3f2-581f-b8b4-007ff5690951", 00:35:26.293 "is_configured": true, 00:35:26.293 "data_offset": 2048, 00:35:26.293 "data_size": 63488 00:35:26.293 }, 00:35:26.293 { 00:35:26.293 "name": "BaseBdev4", 00:35:26.293 "uuid": "11730241-5272-5217-94a3-5cf7f2b4453e", 00:35:26.293 "is_configured": true, 00:35:26.293 "data_offset": 2048, 00:35:26.293 "data_size": 63488 00:35:26.293 } 00:35:26.293 ] 00:35:26.293 }' 00:35:26.293 11:46:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:26.293 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:26.860 11:46:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:35:26.860 [2024-06-10 11:46:10.771567] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:35:26.860 [2024-06-10 11:46:10.771607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:26.860 [2024-06-10 11:46:10.773606] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:26.860 [2024-06-10 11:46:10.773635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:26.860 [2024-06-10 11:46:10.773663] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:26.860 [2024-06-10 11:46:10.773671] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2835020 name raid_bdev1, state offline 00:35:26.860 0 00:35:26.860 11:46:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 202591 00:35:26.860 11:46:10 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 202591 ']' 00:35:26.860 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 202591 00:35:26.860 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:35:26.860 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:26.860 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 202591 00:35:27.119 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:27.119 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:27.119 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 202591' 00:35:27.119 killing process with pid 202591 00:35:27.119 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 202591 00:35:27.119 [2024-06-10 11:46:10.836423] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:27.119 11:46:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 202591 00:35:27.119 [2024-06-10 11:46:10.867081] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.fmckjEamH9 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:35:27.377 
11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:35:27.377 00:35:27.377 real 0m6.163s 00:35:27.377 user 0m9.494s 00:35:27.377 sys 0m1.112s 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:27.377 11:46:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:35:27.377 ************************************ 00:35:27.377 END TEST raid_write_error_test 00:35:27.377 ************************************ 00:35:27.377 11:46:11 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:35:27.377 11:46:11 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:35:27.377 11:46:11 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:35:27.377 11:46:11 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:27.377 11:46:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:27.377 ************************************ 00:35:27.377 START TEST raid_state_function_test 00:35:27.377 ************************************ 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 false 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local 
superblock_create_arg 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=203558 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 203558' 00:35:27.377 Process raid pid: 203558 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 203558 /var/tmp/spdk-raid.sock 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 203558 ']' 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:27.377 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:27.377 11:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:27.377 [2024-06-10 11:46:11.232294] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:35:27.377 [2024-06-10 11:46:11.232355] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:27.377 [2024-06-10 11:46:11.319822] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:27.635 [2024-06-10 11:46:11.403193] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:27.635 [2024-06-10 11:46:11.455454] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:27.635 [2024-06-10 11:46:11.455501] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:28.199 11:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:28.199 11:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:35:28.199 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:28.456 [2024-06-10 11:46:12.192760] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:35:28.456 [2024-06-10 11:46:12.192797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:35:28.456 [2024-06-10 11:46:12.192805] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:35:28.456 [2024-06-10 11:46:12.192812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:35:28.456 [2024-06-10 11:46:12.192818] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:35:28.456 [2024-06-10 11:46:12.192825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:35:28.456 [2024-06-10 11:46:12.192831] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:35:28.456 [2024-06-10 11:46:12.192838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:35:28.456 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:28.456 "name": "Existed_Raid", 00:35:28.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:28.456 "strip_size_kb": 0, 00:35:28.456 "state": "configuring", 00:35:28.456 "raid_level": "raid1", 00:35:28.456 "superblock": false, 00:35:28.456 "num_base_bdevs": 4, 00:35:28.456 "num_base_bdevs_discovered": 0, 00:35:28.456 "num_base_bdevs_operational": 4, 00:35:28.456 "base_bdevs_list": [ 00:35:28.456 { 00:35:28.456 "name": "BaseBdev1", 00:35:28.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:28.456 "is_configured": false, 00:35:28.456 "data_offset": 0, 00:35:28.456 "data_size": 0 00:35:28.456 }, 00:35:28.456 { 00:35:28.457 "name": "BaseBdev2", 00:35:28.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:28.457 "is_configured": false, 00:35:28.457 "data_offset": 0, 00:35:28.457 "data_size": 0 00:35:28.457 }, 00:35:28.457 { 00:35:28.457 "name": "BaseBdev3", 00:35:28.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:28.457 "is_configured": false, 00:35:28.457 "data_offset": 0, 00:35:28.457 "data_size": 0 00:35:28.457 }, 00:35:28.457 { 00:35:28.457 "name": "BaseBdev4", 00:35:28.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:28.457 "is_configured": false, 00:35:28.457 "data_offset": 0, 00:35:28.457 "data_size": 0 00:35:28.457 } 00:35:28.457 ] 00:35:28.457 }' 00:35:28.457 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:28.457 11:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:29.021 11:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:35:29.280 [2024-06-10 11:46:13.022816] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:35:29.281 [2024-06-10 11:46:13.022842] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebd550 name Existed_Raid, state configuring 00:35:29.281 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:29.281 [2024-06-10 11:46:13.207518] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:35:29.281 [2024-06-10 11:46:13.207549] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:35:29.281 [2024-06-10 11:46:13.207556] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:35:29.281 [2024-06-10 11:46:13.207565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:35:29.281 [2024-06-10 11:46:13.207571] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:35:29.281 [2024-06-10 11:46:13.207580] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:35:29.281 [2024-06-10 11:46:13.207587] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:35:29.281 [2024-06-10 11:46:13.207595] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:35:29.539 [2024-06-10 11:46:13.400459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:29.539 BaseBdev1 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:29.539 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:29.797 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:35:30.056 [ 00:35:30.056 { 00:35:30.056 "name": "BaseBdev1", 00:35:30.056 "aliases": [ 00:35:30.056 "e6a97eb2-bafb-4c07-9e11-ddb42e12384b" 00:35:30.056 ], 00:35:30.056 "product_name": "Malloc disk", 00:35:30.056 "block_size": 512, 00:35:30.056 "num_blocks": 65536, 00:35:30.056 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:30.056 "assigned_rate_limits": { 00:35:30.056 "rw_ios_per_sec": 0, 00:35:30.056 "rw_mbytes_per_sec": 0, 00:35:30.056 "r_mbytes_per_sec": 0, 00:35:30.056 "w_mbytes_per_sec": 0 00:35:30.056 }, 00:35:30.056 "claimed": true, 00:35:30.056 "claim_type": "exclusive_write", 00:35:30.056 "zoned": false, 00:35:30.056 "supported_io_types": { 00:35:30.056 "read": true, 00:35:30.056 "write": true, 00:35:30.056 "unmap": true, 00:35:30.056 "write_zeroes": true, 00:35:30.056 "flush": true, 00:35:30.056 "reset": true, 00:35:30.056 "compare": false, 00:35:30.056 "compare_and_write": false, 00:35:30.056 "abort": true, 00:35:30.056 "nvme_admin": false, 00:35:30.056 "nvme_io": false 00:35:30.056 }, 00:35:30.056 "memory_domains": [ 00:35:30.056 { 
00:35:30.056 "dma_device_id": "system", 00:35:30.056 "dma_device_type": 1 00:35:30.056 }, 00:35:30.056 { 00:35:30.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:30.056 "dma_device_type": 2 00:35:30.056 } 00:35:30.056 ], 00:35:30.056 "driver_specific": {} 00:35:30.056 } 00:35:30.056 ] 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:30.056 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:35:30.056 "name": "Existed_Raid", 00:35:30.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:30.056 "strip_size_kb": 0, 00:35:30.056 "state": "configuring", 00:35:30.056 "raid_level": "raid1", 00:35:30.056 "superblock": false, 00:35:30.056 "num_base_bdevs": 4, 00:35:30.056 "num_base_bdevs_discovered": 1, 00:35:30.056 "num_base_bdevs_operational": 4, 00:35:30.056 "base_bdevs_list": [ 00:35:30.056 { 00:35:30.056 "name": "BaseBdev1", 00:35:30.056 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:30.056 "is_configured": true, 00:35:30.056 "data_offset": 0, 00:35:30.056 "data_size": 65536 00:35:30.056 }, 00:35:30.056 { 00:35:30.056 "name": "BaseBdev2", 00:35:30.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:30.056 "is_configured": false, 00:35:30.056 "data_offset": 0, 00:35:30.056 "data_size": 0 00:35:30.056 }, 00:35:30.056 { 00:35:30.056 "name": "BaseBdev3", 00:35:30.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:30.056 "is_configured": false, 00:35:30.056 "data_offset": 0, 00:35:30.056 "data_size": 0 00:35:30.056 }, 00:35:30.056 { 00:35:30.056 "name": "BaseBdev4", 00:35:30.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:30.056 "is_configured": false, 00:35:30.056 "data_offset": 0, 00:35:30.057 "data_size": 0 00:35:30.057 } 00:35:30.057 ] 00:35:30.057 }' 00:35:30.057 11:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:30.057 11:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:30.624 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:35:30.882 [2024-06-10 11:46:14.611577] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:35:30.882 [2024-06-10 11:46:14.611609] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebcdc0 name Existed_Raid, state configuring 
00:35:30.882 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:30.882 [2024-06-10 11:46:14.792066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:30.882 [2024-06-10 11:46:14.793118] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:35:30.882 [2024-06-10 11:46:14.793144] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:35:30.882 [2024-06-10 11:46:14.793156] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:35:30.882 [2024-06-10 11:46:14.793164] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:35:30.882 [2024-06-10 11:46:14.793170] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:35:30.882 [2024-06-10 11:46:14.793178] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:35:30.882 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:35:30.882 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:35:30.882 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:30.883 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:31.141 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:31.141 "name": "Existed_Raid", 00:35:31.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:31.141 "strip_size_kb": 0, 00:35:31.141 "state": "configuring", 00:35:31.141 "raid_level": "raid1", 00:35:31.141 "superblock": false, 00:35:31.141 "num_base_bdevs": 4, 00:35:31.141 "num_base_bdevs_discovered": 1, 00:35:31.141 "num_base_bdevs_operational": 4, 00:35:31.141 "base_bdevs_list": [ 00:35:31.141 { 00:35:31.141 "name": "BaseBdev1", 00:35:31.141 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:31.141 "is_configured": true, 00:35:31.141 "data_offset": 0, 00:35:31.141 "data_size": 65536 00:35:31.141 }, 00:35:31.141 { 00:35:31.141 "name": "BaseBdev2", 00:35:31.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:31.141 "is_configured": false, 00:35:31.141 "data_offset": 0, 00:35:31.141 "data_size": 0 00:35:31.141 }, 00:35:31.141 { 00:35:31.141 "name": "BaseBdev3", 00:35:31.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:31.141 "is_configured": false, 00:35:31.141 
"data_offset": 0, 00:35:31.141 "data_size": 0 00:35:31.141 }, 00:35:31.141 { 00:35:31.141 "name": "BaseBdev4", 00:35:31.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:31.141 "is_configured": false, 00:35:31.141 "data_offset": 0, 00:35:31.141 "data_size": 0 00:35:31.141 } 00:35:31.141 ] 00:35:31.141 }' 00:35:31.141 11:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:31.141 11:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:35:31.708 [2024-06-10 11:46:15.593030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:31.708 BaseBdev2 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:31.708 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:31.966 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:35:32.225 [ 
00:35:32.225 { 00:35:32.225 "name": "BaseBdev2", 00:35:32.225 "aliases": [ 00:35:32.225 "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463" 00:35:32.225 ], 00:35:32.225 "product_name": "Malloc disk", 00:35:32.225 "block_size": 512, 00:35:32.225 "num_blocks": 65536, 00:35:32.225 "uuid": "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:32.225 "assigned_rate_limits": { 00:35:32.225 "rw_ios_per_sec": 0, 00:35:32.225 "rw_mbytes_per_sec": 0, 00:35:32.225 "r_mbytes_per_sec": 0, 00:35:32.225 "w_mbytes_per_sec": 0 00:35:32.225 }, 00:35:32.225 "claimed": true, 00:35:32.225 "claim_type": "exclusive_write", 00:35:32.225 "zoned": false, 00:35:32.225 "supported_io_types": { 00:35:32.225 "read": true, 00:35:32.225 "write": true, 00:35:32.225 "unmap": true, 00:35:32.225 "write_zeroes": true, 00:35:32.225 "flush": true, 00:35:32.225 "reset": true, 00:35:32.225 "compare": false, 00:35:32.225 "compare_and_write": false, 00:35:32.225 "abort": true, 00:35:32.225 "nvme_admin": false, 00:35:32.225 "nvme_io": false 00:35:32.225 }, 00:35:32.225 "memory_domains": [ 00:35:32.225 { 00:35:32.225 "dma_device_id": "system", 00:35:32.225 "dma_device_type": 1 00:35:32.225 }, 00:35:32.225 { 00:35:32.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:32.225 "dma_device_type": 2 00:35:32.225 } 00:35:32.225 ], 00:35:32.225 "driver_specific": {} 00:35:32.225 } 00:35:32.225 ] 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:32.225 11:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:32.225 11:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:32.225 "name": "Existed_Raid", 00:35:32.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:32.225 "strip_size_kb": 0, 00:35:32.225 "state": "configuring", 00:35:32.225 "raid_level": "raid1", 00:35:32.225 "superblock": false, 00:35:32.225 "num_base_bdevs": 4, 00:35:32.225 "num_base_bdevs_discovered": 2, 00:35:32.225 "num_base_bdevs_operational": 4, 00:35:32.225 "base_bdevs_list": [ 00:35:32.225 { 00:35:32.225 "name": "BaseBdev1", 00:35:32.225 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:32.225 "is_configured": true, 00:35:32.225 "data_offset": 0, 00:35:32.225 "data_size": 65536 00:35:32.225 }, 00:35:32.225 { 00:35:32.225 "name": "BaseBdev2", 00:35:32.225 "uuid": 
"26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:32.225 "is_configured": true, 00:35:32.225 "data_offset": 0, 00:35:32.225 "data_size": 65536 00:35:32.225 }, 00:35:32.225 { 00:35:32.225 "name": "BaseBdev3", 00:35:32.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:32.225 "is_configured": false, 00:35:32.225 "data_offset": 0, 00:35:32.225 "data_size": 0 00:35:32.225 }, 00:35:32.225 { 00:35:32.225 "name": "BaseBdev4", 00:35:32.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:32.225 "is_configured": false, 00:35:32.225 "data_offset": 0, 00:35:32.225 "data_size": 0 00:35:32.225 } 00:35:32.225 ] 00:35:32.225 }' 00:35:32.225 11:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:32.225 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:32.792 11:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:35:33.051 [2024-06-10 11:46:16.807022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:35:33.051 BaseBdev3 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:33.051 11:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:35:33.310 [ 00:35:33.310 { 00:35:33.310 "name": "BaseBdev3", 00:35:33.310 "aliases": [ 00:35:33.310 "17c4e37f-7f27-471a-acb5-954c4730bc71" 00:35:33.310 ], 00:35:33.310 "product_name": "Malloc disk", 00:35:33.310 "block_size": 512, 00:35:33.310 "num_blocks": 65536, 00:35:33.310 "uuid": "17c4e37f-7f27-471a-acb5-954c4730bc71", 00:35:33.310 "assigned_rate_limits": { 00:35:33.310 "rw_ios_per_sec": 0, 00:35:33.310 "rw_mbytes_per_sec": 0, 00:35:33.310 "r_mbytes_per_sec": 0, 00:35:33.310 "w_mbytes_per_sec": 0 00:35:33.310 }, 00:35:33.310 "claimed": true, 00:35:33.310 "claim_type": "exclusive_write", 00:35:33.310 "zoned": false, 00:35:33.310 "supported_io_types": { 00:35:33.310 "read": true, 00:35:33.310 "write": true, 00:35:33.310 "unmap": true, 00:35:33.310 "write_zeroes": true, 00:35:33.310 "flush": true, 00:35:33.310 "reset": true, 00:35:33.310 "compare": false, 00:35:33.310 "compare_and_write": false, 00:35:33.310 "abort": true, 00:35:33.310 "nvme_admin": false, 00:35:33.310 "nvme_io": false 00:35:33.310 }, 00:35:33.310 "memory_domains": [ 00:35:33.310 { 00:35:33.310 "dma_device_id": "system", 00:35:33.310 "dma_device_type": 1 00:35:33.310 }, 00:35:33.310 { 00:35:33.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:33.310 "dma_device_type": 2 00:35:33.310 } 00:35:33.310 ], 00:35:33.310 "driver_specific": {} 00:35:33.310 } 00:35:33.310 ] 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:33.310 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:33.569 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:33.569 "name": "Existed_Raid", 00:35:33.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:33.569 "strip_size_kb": 0, 00:35:33.569 "state": "configuring", 00:35:33.569 "raid_level": "raid1", 00:35:33.569 "superblock": false, 00:35:33.569 "num_base_bdevs": 4, 00:35:33.569 "num_base_bdevs_discovered": 3, 00:35:33.569 "num_base_bdevs_operational": 4, 00:35:33.570 
"base_bdevs_list": [ 00:35:33.570 { 00:35:33.570 "name": "BaseBdev1", 00:35:33.570 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:33.570 "is_configured": true, 00:35:33.570 "data_offset": 0, 00:35:33.570 "data_size": 65536 00:35:33.570 }, 00:35:33.570 { 00:35:33.570 "name": "BaseBdev2", 00:35:33.570 "uuid": "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:33.570 "is_configured": true, 00:35:33.570 "data_offset": 0, 00:35:33.570 "data_size": 65536 00:35:33.570 }, 00:35:33.570 { 00:35:33.570 "name": "BaseBdev3", 00:35:33.570 "uuid": "17c4e37f-7f27-471a-acb5-954c4730bc71", 00:35:33.570 "is_configured": true, 00:35:33.570 "data_offset": 0, 00:35:33.570 "data_size": 65536 00:35:33.570 }, 00:35:33.570 { 00:35:33.570 "name": "BaseBdev4", 00:35:33.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:33.570 "is_configured": false, 00:35:33.570 "data_offset": 0, 00:35:33.570 "data_size": 0 00:35:33.570 } 00:35:33.570 ] 00:35:33.570 }' 00:35:33.570 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:33.570 11:46:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:34.138 11:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:35:34.138 [2024-06-10 11:46:17.992985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:35:34.138 [2024-06-10 11:46:17.993024] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ebde20 00:35:34.138 [2024-06-10 11:46:17.993030] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:35:34.138 [2024-06-10 11:46:17.993204] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebea70 00:35:34.138 [2024-06-10 11:46:17.993295] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ebde20 00:35:34.138 
[2024-06-10 11:46:17.993301] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ebde20 00:35:34.138 [2024-06-10 11:46:17.993420] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:34.138 BaseBdev4 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:34.138 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:34.397 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:35:34.397 [ 00:35:34.397 { 00:35:34.397 "name": "BaseBdev4", 00:35:34.397 "aliases": [ 00:35:34.397 "63a17016-c3f1-4c44-a69d-daa796152a59" 00:35:34.397 ], 00:35:34.397 "product_name": "Malloc disk", 00:35:34.397 "block_size": 512, 00:35:34.397 "num_blocks": 65536, 00:35:34.397 "uuid": "63a17016-c3f1-4c44-a69d-daa796152a59", 00:35:34.397 "assigned_rate_limits": { 00:35:34.397 "rw_ios_per_sec": 0, 00:35:34.397 "rw_mbytes_per_sec": 0, 00:35:34.397 "r_mbytes_per_sec": 0, 00:35:34.397 "w_mbytes_per_sec": 0 00:35:34.397 }, 00:35:34.397 "claimed": true, 00:35:34.397 "claim_type": "exclusive_write", 00:35:34.397 "zoned": 
false, 00:35:34.397 "supported_io_types": { 00:35:34.397 "read": true, 00:35:34.397 "write": true, 00:35:34.397 "unmap": true, 00:35:34.397 "write_zeroes": true, 00:35:34.397 "flush": true, 00:35:34.397 "reset": true, 00:35:34.397 "compare": false, 00:35:34.397 "compare_and_write": false, 00:35:34.397 "abort": true, 00:35:34.397 "nvme_admin": false, 00:35:34.397 "nvme_io": false 00:35:34.397 }, 00:35:34.397 "memory_domains": [ 00:35:34.397 { 00:35:34.397 "dma_device_id": "system", 00:35:34.397 "dma_device_type": 1 00:35:34.397 }, 00:35:34.397 { 00:35:34.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:34.397 "dma_device_type": 2 00:35:34.397 } 00:35:34.397 ], 00:35:34.397 "driver_specific": {} 00:35:34.397 } 00:35:34.397 ] 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:34.657 
11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:34.657 "name": "Existed_Raid", 00:35:34.657 "uuid": "4a962f54-8286-4264-a002-60ba03498468", 00:35:34.657 "strip_size_kb": 0, 00:35:34.657 "state": "online", 00:35:34.657 "raid_level": "raid1", 00:35:34.657 "superblock": false, 00:35:34.657 "num_base_bdevs": 4, 00:35:34.657 "num_base_bdevs_discovered": 4, 00:35:34.657 "num_base_bdevs_operational": 4, 00:35:34.657 "base_bdevs_list": [ 00:35:34.657 { 00:35:34.657 "name": "BaseBdev1", 00:35:34.657 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:34.657 "is_configured": true, 00:35:34.657 "data_offset": 0, 00:35:34.657 "data_size": 65536 00:35:34.657 }, 00:35:34.657 { 00:35:34.657 "name": "BaseBdev2", 00:35:34.657 "uuid": "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:34.657 "is_configured": true, 00:35:34.657 "data_offset": 0, 00:35:34.657 "data_size": 65536 00:35:34.657 }, 00:35:34.657 { 00:35:34.657 "name": "BaseBdev3", 00:35:34.657 "uuid": "17c4e37f-7f27-471a-acb5-954c4730bc71", 00:35:34.657 "is_configured": true, 00:35:34.657 "data_offset": 0, 00:35:34.657 "data_size": 65536 00:35:34.657 }, 00:35:34.657 { 00:35:34.657 "name": "BaseBdev4", 00:35:34.657 "uuid": "63a17016-c3f1-4c44-a69d-daa796152a59", 00:35:34.657 "is_configured": true, 00:35:34.657 "data_offset": 0, 00:35:34.657 "data_size": 65536 00:35:34.657 } 00:35:34.657 ] 00:35:34.657 }' 00:35:34.657 11:46:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:34.657 11:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:35:35.224 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:35:35.224 [2024-06-10 11:46:19.160201] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:35.483 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:35:35.483 "name": "Existed_Raid", 00:35:35.483 "aliases": [ 00:35:35.483 "4a962f54-8286-4264-a002-60ba03498468" 00:35:35.483 ], 00:35:35.483 "product_name": "Raid Volume", 00:35:35.483 "block_size": 512, 00:35:35.483 "num_blocks": 65536, 00:35:35.483 "uuid": "4a962f54-8286-4264-a002-60ba03498468", 00:35:35.483 "assigned_rate_limits": { 00:35:35.483 "rw_ios_per_sec": 0, 00:35:35.483 "rw_mbytes_per_sec": 0, 00:35:35.483 "r_mbytes_per_sec": 0, 00:35:35.483 "w_mbytes_per_sec": 0 00:35:35.483 }, 00:35:35.483 "claimed": false, 00:35:35.483 "zoned": false, 00:35:35.483 "supported_io_types": { 00:35:35.483 "read": 
true, 00:35:35.483 "write": true, 00:35:35.483 "unmap": false, 00:35:35.483 "write_zeroes": true, 00:35:35.483 "flush": false, 00:35:35.483 "reset": true, 00:35:35.483 "compare": false, 00:35:35.483 "compare_and_write": false, 00:35:35.483 "abort": false, 00:35:35.483 "nvme_admin": false, 00:35:35.483 "nvme_io": false 00:35:35.483 }, 00:35:35.483 "memory_domains": [ 00:35:35.483 { 00:35:35.483 "dma_device_id": "system", 00:35:35.483 "dma_device_type": 1 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:35.483 "dma_device_type": 2 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "system", 00:35:35.483 "dma_device_type": 1 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:35.483 "dma_device_type": 2 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "system", 00:35:35.483 "dma_device_type": 1 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:35.483 "dma_device_type": 2 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "system", 00:35:35.483 "dma_device_type": 1 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:35.483 "dma_device_type": 2 00:35:35.483 } 00:35:35.483 ], 00:35:35.483 "driver_specific": { 00:35:35.483 "raid": { 00:35:35.483 "uuid": "4a962f54-8286-4264-a002-60ba03498468", 00:35:35.483 "strip_size_kb": 0, 00:35:35.483 "state": "online", 00:35:35.483 "raid_level": "raid1", 00:35:35.483 "superblock": false, 00:35:35.483 "num_base_bdevs": 4, 00:35:35.483 "num_base_bdevs_discovered": 4, 00:35:35.483 "num_base_bdevs_operational": 4, 00:35:35.483 "base_bdevs_list": [ 00:35:35.483 { 00:35:35.483 "name": "BaseBdev1", 00:35:35.483 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:35.483 "is_configured": true, 00:35:35.483 "data_offset": 0, 00:35:35.483 "data_size": 65536 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "name": "BaseBdev2", 00:35:35.483 "uuid": 
"26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:35.483 "is_configured": true, 00:35:35.483 "data_offset": 0, 00:35:35.483 "data_size": 65536 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "name": "BaseBdev3", 00:35:35.483 "uuid": "17c4e37f-7f27-471a-acb5-954c4730bc71", 00:35:35.483 "is_configured": true, 00:35:35.483 "data_offset": 0, 00:35:35.483 "data_size": 65536 00:35:35.483 }, 00:35:35.483 { 00:35:35.483 "name": "BaseBdev4", 00:35:35.483 "uuid": "63a17016-c3f1-4c44-a69d-daa796152a59", 00:35:35.483 "is_configured": true, 00:35:35.483 "data_offset": 0, 00:35:35.483 "data_size": 65536 00:35:35.483 } 00:35:35.483 ] 00:35:35.483 } 00:35:35.483 } 00:35:35.483 }' 00:35:35.483 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:35:35.483 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:35:35.483 BaseBdev2 00:35:35.483 BaseBdev3 00:35:35.483 BaseBdev4' 00:35:35.483 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:35.483 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:35.483 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:35:35.484 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:35.484 "name": "BaseBdev1", 00:35:35.484 "aliases": [ 00:35:35.484 "e6a97eb2-bafb-4c07-9e11-ddb42e12384b" 00:35:35.484 ], 00:35:35.484 "product_name": "Malloc disk", 00:35:35.484 "block_size": 512, 00:35:35.484 "num_blocks": 65536, 00:35:35.484 "uuid": "e6a97eb2-bafb-4c07-9e11-ddb42e12384b", 00:35:35.484 "assigned_rate_limits": { 00:35:35.484 "rw_ios_per_sec": 0, 00:35:35.484 "rw_mbytes_per_sec": 0, 00:35:35.484 "r_mbytes_per_sec": 0, 
00:35:35.484 "w_mbytes_per_sec": 0 00:35:35.484 }, 00:35:35.484 "claimed": true, 00:35:35.484 "claim_type": "exclusive_write", 00:35:35.484 "zoned": false, 00:35:35.484 "supported_io_types": { 00:35:35.484 "read": true, 00:35:35.484 "write": true, 00:35:35.484 "unmap": true, 00:35:35.484 "write_zeroes": true, 00:35:35.484 "flush": true, 00:35:35.484 "reset": true, 00:35:35.484 "compare": false, 00:35:35.484 "compare_and_write": false, 00:35:35.484 "abort": true, 00:35:35.484 "nvme_admin": false, 00:35:35.484 "nvme_io": false 00:35:35.484 }, 00:35:35.484 "memory_domains": [ 00:35:35.484 { 00:35:35.484 "dma_device_id": "system", 00:35:35.484 "dma_device_type": 1 00:35:35.484 }, 00:35:35.484 { 00:35:35.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:35.484 "dma_device_type": 2 00:35:35.484 } 00:35:35.484 ], 00:35:35.484 "driver_specific": {} 00:35:35.484 }' 00:35:35.484 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:35.742 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:35:36.001 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:36.001 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:36.001 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:35:36.001 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:36.001 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:36.001 "name": "BaseBdev2", 00:35:36.001 "aliases": [ 00:35:36.001 "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463" 00:35:36.001 ], 00:35:36.001 "product_name": "Malloc disk", 00:35:36.001 "block_size": 512, 00:35:36.001 "num_blocks": 65536, 00:35:36.001 "uuid": "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:36.001 "assigned_rate_limits": { 00:35:36.001 "rw_ios_per_sec": 0, 00:35:36.001 "rw_mbytes_per_sec": 0, 00:35:36.001 "r_mbytes_per_sec": 0, 00:35:36.001 "w_mbytes_per_sec": 0 00:35:36.001 }, 00:35:36.001 "claimed": true, 00:35:36.001 "claim_type": "exclusive_write", 00:35:36.001 "zoned": false, 00:35:36.001 "supported_io_types": { 00:35:36.001 "read": true, 00:35:36.001 "write": true, 00:35:36.001 "unmap": true, 00:35:36.001 "write_zeroes": true, 00:35:36.001 "flush": true, 00:35:36.001 "reset": true, 00:35:36.001 "compare": false, 00:35:36.001 "compare_and_write": false, 00:35:36.001 "abort": true, 00:35:36.001 "nvme_admin": false, 00:35:36.001 "nvme_io": false 00:35:36.001 }, 00:35:36.001 "memory_domains": [ 00:35:36.001 { 00:35:36.001 "dma_device_id": "system", 00:35:36.001 "dma_device_type": 1 00:35:36.001 }, 00:35:36.001 { 00:35:36.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:36.001 "dma_device_type": 2 00:35:36.001 } 00:35:36.001 ], 00:35:36.001 "driver_specific": {} 00:35:36.001 }' 00:35:36.001 11:46:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:36.260 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:36.260 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:36.260 11:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:36.260 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:36.518 "name": "BaseBdev3", 00:35:36.518 "aliases": [ 00:35:36.518 "17c4e37f-7f27-471a-acb5-954c4730bc71" 00:35:36.518 ], 00:35:36.518 "product_name": "Malloc disk", 00:35:36.518 "block_size": 512, 00:35:36.518 "num_blocks": 65536, 00:35:36.518 "uuid": 
"17c4e37f-7f27-471a-acb5-954c4730bc71", 00:35:36.518 "assigned_rate_limits": { 00:35:36.518 "rw_ios_per_sec": 0, 00:35:36.518 "rw_mbytes_per_sec": 0, 00:35:36.518 "r_mbytes_per_sec": 0, 00:35:36.518 "w_mbytes_per_sec": 0 00:35:36.518 }, 00:35:36.518 "claimed": true, 00:35:36.518 "claim_type": "exclusive_write", 00:35:36.518 "zoned": false, 00:35:36.518 "supported_io_types": { 00:35:36.518 "read": true, 00:35:36.518 "write": true, 00:35:36.518 "unmap": true, 00:35:36.518 "write_zeroes": true, 00:35:36.518 "flush": true, 00:35:36.518 "reset": true, 00:35:36.518 "compare": false, 00:35:36.518 "compare_and_write": false, 00:35:36.518 "abort": true, 00:35:36.518 "nvme_admin": false, 00:35:36.518 "nvme_io": false 00:35:36.518 }, 00:35:36.518 "memory_domains": [ 00:35:36.518 { 00:35:36.518 "dma_device_id": "system", 00:35:36.518 "dma_device_type": 1 00:35:36.518 }, 00:35:36.518 { 00:35:36.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:36.518 "dma_device_type": 2 00:35:36.518 } 00:35:36.518 ], 00:35:36.518 "driver_specific": {} 00:35:36.518 }' 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:36.518 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:36.776 
11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:36.776 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:37.035 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:37.035 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:37.036 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:35:37.036 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:37.036 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:37.036 "name": "BaseBdev4", 00:35:37.036 "aliases": [ 00:35:37.036 "63a17016-c3f1-4c44-a69d-daa796152a59" 00:35:37.036 ], 00:35:37.036 "product_name": "Malloc disk", 00:35:37.036 "block_size": 512, 00:35:37.036 "num_blocks": 65536, 00:35:37.036 "uuid": "63a17016-c3f1-4c44-a69d-daa796152a59", 00:35:37.036 "assigned_rate_limits": { 00:35:37.036 "rw_ios_per_sec": 0, 00:35:37.036 "rw_mbytes_per_sec": 0, 00:35:37.036 "r_mbytes_per_sec": 0, 00:35:37.036 "w_mbytes_per_sec": 0 00:35:37.036 }, 00:35:37.036 "claimed": true, 00:35:37.036 "claim_type": "exclusive_write", 00:35:37.036 "zoned": false, 00:35:37.036 "supported_io_types": { 00:35:37.036 "read": true, 00:35:37.036 "write": true, 00:35:37.036 "unmap": true, 00:35:37.036 "write_zeroes": true, 00:35:37.036 "flush": true, 00:35:37.036 "reset": true, 00:35:37.036 "compare": false, 00:35:37.036 "compare_and_write": false, 00:35:37.036 "abort": true, 00:35:37.036 "nvme_admin": false, 00:35:37.036 "nvme_io": false 00:35:37.036 }, 00:35:37.036 "memory_domains": [ 00:35:37.036 { 00:35:37.036 "dma_device_id": "system", 00:35:37.036 "dma_device_type": 1 00:35:37.036 }, 00:35:37.036 { 00:35:37.036 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:35:37.036 "dma_device_type": 2 00:35:37.036 } 00:35:37.036 ], 00:35:37.036 "driver_specific": {} 00:35:37.036 }' 00:35:37.036 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:37.036 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:37.294 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:37.294 11:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:37.294 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:35:37.553 [2024-06-10 11:46:21.381811] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:37.553 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:37.812 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:37.812 "name": "Existed_Raid", 00:35:37.812 "uuid": "4a962f54-8286-4264-a002-60ba03498468", 00:35:37.812 "strip_size_kb": 0, 00:35:37.812 "state": "online", 00:35:37.812 "raid_level": "raid1", 
00:35:37.812 "superblock": false, 00:35:37.812 "num_base_bdevs": 4, 00:35:37.812 "num_base_bdevs_discovered": 3, 00:35:37.812 "num_base_bdevs_operational": 3, 00:35:37.812 "base_bdevs_list": [ 00:35:37.812 { 00:35:37.812 "name": null, 00:35:37.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:37.812 "is_configured": false, 00:35:37.812 "data_offset": 0, 00:35:37.812 "data_size": 65536 00:35:37.812 }, 00:35:37.812 { 00:35:37.812 "name": "BaseBdev2", 00:35:37.812 "uuid": "26d0e0e7-b70c-4b99-8bd9-f0c5c3993463", 00:35:37.812 "is_configured": true, 00:35:37.812 "data_offset": 0, 00:35:37.812 "data_size": 65536 00:35:37.812 }, 00:35:37.812 { 00:35:37.812 "name": "BaseBdev3", 00:35:37.812 "uuid": "17c4e37f-7f27-471a-acb5-954c4730bc71", 00:35:37.812 "is_configured": true, 00:35:37.812 "data_offset": 0, 00:35:37.812 "data_size": 65536 00:35:37.812 }, 00:35:37.812 { 00:35:37.812 "name": "BaseBdev4", 00:35:37.812 "uuid": "63a17016-c3f1-4c44-a69d-daa796152a59", 00:35:37.812 "is_configured": true, 00:35:37.812 "data_offset": 0, 00:35:37.812 "data_size": 65536 00:35:37.812 } 00:35:37.812 ] 00:35:37.812 }' 00:35:37.812 11:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:37.812 11:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:38.384 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:35:38.384 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:35:38.384 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:35:38.384 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:38.384 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:35:38.384 11:46:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:35:38.384 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:35:38.645 [2024-06-10 11:46:22.421271] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:35:38.645 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:35:38.645 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:35:38.645 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:38.645 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:35:38.905 [2024-06-10 11:46:22.782212] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:35:38.905 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:35:39.163 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:35:39.163 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:35:39.163 11:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:35:39.422 [2024-06-10 11:46:23.135081] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:35:39.422 [2024-06-10 11:46:23.135145] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:39.422 [2024-06-10 11:46:23.145356] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:39.422 [2024-06-10 11:46:23.145385] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:39.422 [2024-06-10 11:46:23.145394] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebde20 name Existed_Raid, state offline 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:35:39.422 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:35:39.680 BaseBdev2 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:39.680 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:39.938 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:35:39.938 [ 00:35:39.938 { 00:35:39.938 "name": "BaseBdev2", 00:35:39.938 "aliases": [ 00:35:39.938 "c10581b9-98fa-4177-a1a9-c130e76910ab" 00:35:39.938 ], 00:35:39.938 "product_name": "Malloc disk", 00:35:39.938 "block_size": 512, 00:35:39.938 "num_blocks": 65536, 00:35:39.938 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:39.938 "assigned_rate_limits": { 00:35:39.938 "rw_ios_per_sec": 0, 00:35:39.938 "rw_mbytes_per_sec": 0, 00:35:39.938 "r_mbytes_per_sec": 0, 00:35:39.938 "w_mbytes_per_sec": 0 
00:35:39.938 }, 00:35:39.938 "claimed": false, 00:35:39.938 "zoned": false, 00:35:39.938 "supported_io_types": { 00:35:39.938 "read": true, 00:35:39.938 "write": true, 00:35:39.938 "unmap": true, 00:35:39.938 "write_zeroes": true, 00:35:39.938 "flush": true, 00:35:39.938 "reset": true, 00:35:39.938 "compare": false, 00:35:39.938 "compare_and_write": false, 00:35:39.938 "abort": true, 00:35:39.938 "nvme_admin": false, 00:35:39.938 "nvme_io": false 00:35:39.938 }, 00:35:39.938 "memory_domains": [ 00:35:39.938 { 00:35:39.938 "dma_device_id": "system", 00:35:39.938 "dma_device_type": 1 00:35:39.938 }, 00:35:39.938 { 00:35:39.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:39.938 "dma_device_type": 2 00:35:39.938 } 00:35:39.938 ], 00:35:39.938 "driver_specific": {} 00:35:39.938 } 00:35:39.938 ] 00:35:39.938 11:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:39.938 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:35:39.938 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:35:39.938 11:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:35:40.196 BaseBdev3 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:40.196 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:40.455 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:35:40.455 [ 00:35:40.455 { 00:35:40.455 "name": "BaseBdev3", 00:35:40.455 "aliases": [ 00:35:40.455 "9c9bfd58-7be4-412d-b9cc-b18697860c2e" 00:35:40.455 ], 00:35:40.455 "product_name": "Malloc disk", 00:35:40.455 "block_size": 512, 00:35:40.455 "num_blocks": 65536, 00:35:40.455 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:40.455 "assigned_rate_limits": { 00:35:40.455 "rw_ios_per_sec": 0, 00:35:40.455 "rw_mbytes_per_sec": 0, 00:35:40.455 "r_mbytes_per_sec": 0, 00:35:40.455 "w_mbytes_per_sec": 0 00:35:40.455 }, 00:35:40.455 "claimed": false, 00:35:40.455 "zoned": false, 00:35:40.455 "supported_io_types": { 00:35:40.455 "read": true, 00:35:40.455 "write": true, 00:35:40.455 "unmap": true, 00:35:40.455 "write_zeroes": true, 00:35:40.455 "flush": true, 00:35:40.455 "reset": true, 00:35:40.455 "compare": false, 00:35:40.455 "compare_and_write": false, 00:35:40.455 "abort": true, 00:35:40.455 "nvme_admin": false, 00:35:40.455 "nvme_io": false 00:35:40.455 }, 00:35:40.455 "memory_domains": [ 00:35:40.455 { 00:35:40.455 "dma_device_id": "system", 00:35:40.455 "dma_device_type": 1 00:35:40.455 }, 00:35:40.455 { 00:35:40.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:40.455 "dma_device_type": 2 00:35:40.455 } 00:35:40.455 ], 00:35:40.455 "driver_specific": {} 00:35:40.455 } 00:35:40.455 ] 00:35:40.455 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:40.455 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 
00:35:40.455 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:35:40.455 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:35:40.713 BaseBdev4 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:40.713 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:40.971 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:35:40.971 [ 00:35:40.971 { 00:35:40.971 "name": "BaseBdev4", 00:35:40.971 "aliases": [ 00:35:40.971 "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d" 00:35:40.971 ], 00:35:40.971 "product_name": "Malloc disk", 00:35:40.971 "block_size": 512, 00:35:40.971 "num_blocks": 65536, 00:35:40.971 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:40.971 "assigned_rate_limits": { 00:35:40.971 "rw_ios_per_sec": 0, 00:35:40.971 "rw_mbytes_per_sec": 0, 00:35:40.971 "r_mbytes_per_sec": 0, 00:35:40.971 "w_mbytes_per_sec": 0 00:35:40.971 }, 00:35:40.971 "claimed": 
false, 00:35:40.971 "zoned": false, 00:35:40.971 "supported_io_types": { 00:35:40.971 "read": true, 00:35:40.971 "write": true, 00:35:40.971 "unmap": true, 00:35:40.971 "write_zeroes": true, 00:35:40.971 "flush": true, 00:35:40.971 "reset": true, 00:35:40.971 "compare": false, 00:35:40.971 "compare_and_write": false, 00:35:40.971 "abort": true, 00:35:40.971 "nvme_admin": false, 00:35:40.971 "nvme_io": false 00:35:40.971 }, 00:35:40.971 "memory_domains": [ 00:35:40.971 { 00:35:40.971 "dma_device_id": "system", 00:35:40.971 "dma_device_type": 1 00:35:40.971 }, 00:35:40.971 { 00:35:40.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:40.971 "dma_device_type": 2 00:35:40.971 } 00:35:40.971 ], 00:35:40.971 "driver_specific": {} 00:35:40.971 } 00:35:40.971 ] 00:35:40.971 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:40.971 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:35:40.971 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:35:40.971 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:41.229 [2024-06-10 11:46:25.035321] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:35:41.229 [2024-06-10 11:46:25.035360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:35:41.229 [2024-06-10 11:46:25.035372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:41.229 [2024-06-10 11:46:25.036350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:35:41.229 [2024-06-10 11:46:25.036380] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:35:41.229 
11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:41.229 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:41.488 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:41.488 "name": "Existed_Raid", 00:35:41.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:41.488 "strip_size_kb": 0, 00:35:41.488 "state": "configuring", 00:35:41.488 "raid_level": "raid1", 00:35:41.488 "superblock": false, 00:35:41.488 "num_base_bdevs": 4, 00:35:41.488 "num_base_bdevs_discovered": 3, 00:35:41.488 "num_base_bdevs_operational": 4, 00:35:41.488 "base_bdevs_list": [ 00:35:41.488 { 
00:35:41.488 "name": "BaseBdev1", 00:35:41.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:41.488 "is_configured": false, 00:35:41.488 "data_offset": 0, 00:35:41.488 "data_size": 0 00:35:41.488 }, 00:35:41.488 { 00:35:41.488 "name": "BaseBdev2", 00:35:41.488 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:41.488 "is_configured": true, 00:35:41.488 "data_offset": 0, 00:35:41.488 "data_size": 65536 00:35:41.488 }, 00:35:41.488 { 00:35:41.488 "name": "BaseBdev3", 00:35:41.488 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:41.488 "is_configured": true, 00:35:41.488 "data_offset": 0, 00:35:41.488 "data_size": 65536 00:35:41.488 }, 00:35:41.488 { 00:35:41.488 "name": "BaseBdev4", 00:35:41.488 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:41.488 "is_configured": true, 00:35:41.488 "data_offset": 0, 00:35:41.488 "data_size": 65536 00:35:41.488 } 00:35:41.488 ] 00:35:41.488 }' 00:35:41.488 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:41.488 11:46:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:35:42.054 [2024-06-10 11:46:25.901729] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:42.054 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:42.312 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:42.312 "name": "Existed_Raid", 00:35:42.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:42.312 "strip_size_kb": 0, 00:35:42.312 "state": "configuring", 00:35:42.312 "raid_level": "raid1", 00:35:42.312 "superblock": false, 00:35:42.312 "num_base_bdevs": 4, 00:35:42.312 "num_base_bdevs_discovered": 2, 00:35:42.312 "num_base_bdevs_operational": 4, 00:35:42.312 "base_bdevs_list": [ 00:35:42.312 { 00:35:42.312 "name": "BaseBdev1", 00:35:42.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:42.312 "is_configured": false, 00:35:42.312 "data_offset": 0, 00:35:42.312 "data_size": 0 00:35:42.312 }, 00:35:42.312 { 00:35:42.312 "name": null, 00:35:42.312 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:42.312 "is_configured": false, 00:35:42.312 "data_offset": 0, 00:35:42.312 "data_size": 65536 00:35:42.312 }, 00:35:42.312 { 00:35:42.312 "name": "BaseBdev3", 00:35:42.312 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:42.312 
"is_configured": true, 00:35:42.312 "data_offset": 0, 00:35:42.312 "data_size": 65536 00:35:42.312 }, 00:35:42.312 { 00:35:42.312 "name": "BaseBdev4", 00:35:42.312 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:42.312 "is_configured": true, 00:35:42.312 "data_offset": 0, 00:35:42.312 "data_size": 65536 00:35:42.312 } 00:35:42.312 ] 00:35:42.312 }' 00:35:42.312 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:42.312 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:42.880 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:35:42.880 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:42.880 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:35:42.880 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:35:43.139 [2024-06-10 11:46:26.900300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:43.139 BaseBdev1 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:43.139 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:35:43.399 [ 00:35:43.399 { 00:35:43.399 "name": "BaseBdev1", 00:35:43.399 "aliases": [ 00:35:43.399 "ca0bb4e3-b4de-423e-9092-70e804bfa97e" 00:35:43.399 ], 00:35:43.399 "product_name": "Malloc disk", 00:35:43.399 "block_size": 512, 00:35:43.399 "num_blocks": 65536, 00:35:43.399 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:43.399 "assigned_rate_limits": { 00:35:43.399 "rw_ios_per_sec": 0, 00:35:43.399 "rw_mbytes_per_sec": 0, 00:35:43.399 "r_mbytes_per_sec": 0, 00:35:43.399 "w_mbytes_per_sec": 0 00:35:43.399 }, 00:35:43.399 "claimed": true, 00:35:43.399 "claim_type": "exclusive_write", 00:35:43.399 "zoned": false, 00:35:43.399 "supported_io_types": { 00:35:43.399 "read": true, 00:35:43.399 "write": true, 00:35:43.399 "unmap": true, 00:35:43.399 "write_zeroes": true, 00:35:43.399 "flush": true, 00:35:43.399 "reset": true, 00:35:43.399 "compare": false, 00:35:43.399 "compare_and_write": false, 00:35:43.399 "abort": true, 00:35:43.399 "nvme_admin": false, 00:35:43.399 "nvme_io": false 00:35:43.399 }, 00:35:43.399 "memory_domains": [ 00:35:43.399 { 00:35:43.399 "dma_device_id": "system", 00:35:43.399 "dma_device_type": 1 00:35:43.399 }, 00:35:43.399 { 00:35:43.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:43.399 "dma_device_type": 2 00:35:43.399 } 00:35:43.399 ], 00:35:43.399 "driver_specific": {} 00:35:43.399 } 00:35:43.399 ] 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:43.399 11:46:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:43.399 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:43.658 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:43.658 "name": "Existed_Raid", 00:35:43.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:43.658 "strip_size_kb": 0, 00:35:43.658 "state": "configuring", 00:35:43.658 "raid_level": "raid1", 00:35:43.658 "superblock": false, 00:35:43.658 "num_base_bdevs": 4, 00:35:43.658 "num_base_bdevs_discovered": 3, 00:35:43.658 "num_base_bdevs_operational": 4, 00:35:43.658 "base_bdevs_list": [ 00:35:43.658 { 00:35:43.658 
"name": "BaseBdev1", 00:35:43.658 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:43.658 "is_configured": true, 00:35:43.658 "data_offset": 0, 00:35:43.658 "data_size": 65536 00:35:43.658 }, 00:35:43.658 { 00:35:43.658 "name": null, 00:35:43.658 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:43.658 "is_configured": false, 00:35:43.658 "data_offset": 0, 00:35:43.658 "data_size": 65536 00:35:43.658 }, 00:35:43.658 { 00:35:43.658 "name": "BaseBdev3", 00:35:43.658 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:43.658 "is_configured": true, 00:35:43.658 "data_offset": 0, 00:35:43.658 "data_size": 65536 00:35:43.658 }, 00:35:43.658 { 00:35:43.658 "name": "BaseBdev4", 00:35:43.658 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:43.658 "is_configured": true, 00:35:43.658 "data_offset": 0, 00:35:43.658 "data_size": 65536 00:35:43.658 } 00:35:43.658 ] 00:35:43.658 }' 00:35:43.658 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:43.659 11:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:44.225 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:44.225 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:35:44.225 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:35:44.225 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:35:44.483 [2024-06-10 11:46:28.259828] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring 
raid1 0 4 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:44.483 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:44.742 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:44.742 "name": "Existed_Raid", 00:35:44.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:44.742 "strip_size_kb": 0, 00:35:44.742 "state": "configuring", 00:35:44.742 "raid_level": "raid1", 00:35:44.742 "superblock": false, 00:35:44.742 "num_base_bdevs": 4, 00:35:44.742 "num_base_bdevs_discovered": 2, 00:35:44.742 "num_base_bdevs_operational": 4, 00:35:44.742 "base_bdevs_list": [ 00:35:44.742 { 00:35:44.742 "name": "BaseBdev1", 00:35:44.742 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:44.742 "is_configured": 
true, 00:35:44.742 "data_offset": 0, 00:35:44.742 "data_size": 65536 00:35:44.742 }, 00:35:44.742 { 00:35:44.742 "name": null, 00:35:44.742 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:44.742 "is_configured": false, 00:35:44.742 "data_offset": 0, 00:35:44.742 "data_size": 65536 00:35:44.742 }, 00:35:44.742 { 00:35:44.742 "name": null, 00:35:44.742 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:44.742 "is_configured": false, 00:35:44.742 "data_offset": 0, 00:35:44.742 "data_size": 65536 00:35:44.742 }, 00:35:44.742 { 00:35:44.742 "name": "BaseBdev4", 00:35:44.742 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:44.742 "is_configured": true, 00:35:44.742 "data_offset": 0, 00:35:44.742 "data_size": 65536 00:35:44.742 } 00:35:44.742 ] 00:35:44.742 }' 00:35:44.742 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:44.742 11:46:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:45.308 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:35:45.308 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:45.308 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:35:45.308 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:35:45.566 [2024-06-10 11:46:29.294507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:45.566 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:45.566 "name": "Existed_Raid", 00:35:45.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:45.566 "strip_size_kb": 0, 00:35:45.566 "state": "configuring", 00:35:45.566 "raid_level": "raid1", 00:35:45.566 "superblock": false, 00:35:45.566 "num_base_bdevs": 4, 00:35:45.566 "num_base_bdevs_discovered": 3, 00:35:45.566 "num_base_bdevs_operational": 4, 00:35:45.566 "base_bdevs_list": [ 00:35:45.566 { 00:35:45.566 "name": "BaseBdev1", 00:35:45.567 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:45.567 "is_configured": true, 00:35:45.567 "data_offset": 0, 00:35:45.567 "data_size": 65536 
00:35:45.567 }, 00:35:45.567 { 00:35:45.567 "name": null, 00:35:45.567 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:45.567 "is_configured": false, 00:35:45.567 "data_offset": 0, 00:35:45.567 "data_size": 65536 00:35:45.567 }, 00:35:45.567 { 00:35:45.567 "name": "BaseBdev3", 00:35:45.567 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:45.567 "is_configured": true, 00:35:45.567 "data_offset": 0, 00:35:45.567 "data_size": 65536 00:35:45.567 }, 00:35:45.567 { 00:35:45.567 "name": "BaseBdev4", 00:35:45.567 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:45.567 "is_configured": true, 00:35:45.567 "data_offset": 0, 00:35:45.567 "data_size": 65536 00:35:45.567 } 00:35:45.567 ] 00:35:45.567 }' 00:35:45.567 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:45.567 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:46.132 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:46.132 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:35:46.390 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:35:46.390 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:35:46.648 [2024-06-10 11:46:30.337239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:46.648 "name": "Existed_Raid", 00:35:46.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:46.648 "strip_size_kb": 0, 00:35:46.648 "state": "configuring", 00:35:46.648 "raid_level": "raid1", 00:35:46.648 "superblock": false, 00:35:46.648 "num_base_bdevs": 4, 00:35:46.648 "num_base_bdevs_discovered": 2, 00:35:46.648 "num_base_bdevs_operational": 4, 00:35:46.648 "base_bdevs_list": [ 00:35:46.648 { 00:35:46.648 "name": null, 00:35:46.648 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:46.648 "is_configured": false, 00:35:46.648 "data_offset": 0, 00:35:46.648 "data_size": 65536 00:35:46.648 }, 00:35:46.648 { 00:35:46.648 "name": null, 00:35:46.648 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 
00:35:46.648 "is_configured": false, 00:35:46.648 "data_offset": 0, 00:35:46.648 "data_size": 65536 00:35:46.648 }, 00:35:46.648 { 00:35:46.648 "name": "BaseBdev3", 00:35:46.648 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:46.648 "is_configured": true, 00:35:46.648 "data_offset": 0, 00:35:46.648 "data_size": 65536 00:35:46.648 }, 00:35:46.648 { 00:35:46.648 "name": "BaseBdev4", 00:35:46.648 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:46.648 "is_configured": true, 00:35:46.648 "data_offset": 0, 00:35:46.648 "data_size": 65536 00:35:46.648 } 00:35:46.648 ] 00:35:46.648 }' 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:46.648 11:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:47.214 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:35:47.214 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:35:47.473 [2024-06-10 11:46:31.363751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:47.473 11:46:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:47.473 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:47.474 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:47.735 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:47.735 "name": "Existed_Raid", 00:35:47.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:47.735 "strip_size_kb": 0, 00:35:47.735 "state": "configuring", 00:35:47.735 "raid_level": "raid1", 00:35:47.735 "superblock": false, 00:35:47.735 "num_base_bdevs": 4, 00:35:47.735 "num_base_bdevs_discovered": 3, 00:35:47.735 "num_base_bdevs_operational": 4, 00:35:47.735 "base_bdevs_list": [ 00:35:47.735 { 00:35:47.735 "name": null, 00:35:47.735 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:47.735 "is_configured": false, 00:35:47.735 "data_offset": 0, 00:35:47.735 "data_size": 65536 00:35:47.735 }, 00:35:47.735 { 00:35:47.735 "name": "BaseBdev2", 00:35:47.735 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:47.735 "is_configured": true, 00:35:47.735 "data_offset": 0, 00:35:47.735 
"data_size": 65536 00:35:47.735 }, 00:35:47.735 { 00:35:47.735 "name": "BaseBdev3", 00:35:47.735 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:47.735 "is_configured": true, 00:35:47.735 "data_offset": 0, 00:35:47.735 "data_size": 65536 00:35:47.735 }, 00:35:47.735 { 00:35:47.735 "name": "BaseBdev4", 00:35:47.735 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:47.735 "is_configured": true, 00:35:47.735 "data_offset": 0, 00:35:47.735 "data_size": 65536 00:35:47.735 } 00:35:47.735 ] 00:35:47.735 }' 00:35:47.735 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:47.735 11:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:48.383 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:48.384 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:35:48.384 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:35:48.384 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:48.384 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:35:48.643 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ca0bb4e3-b4de-423e-9092-70e804bfa97e 00:35:48.903 [2024-06-10 11:46:32.598109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:35:48.903 [2024-06-10 11:46:32.598145] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ebca50 
00:35:48.903 [2024-06-10 11:46:32.598150] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:35:48.903 [2024-06-10 11:46:32.598288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dd7d60 00:35:48.903 [2024-06-10 11:46:32.598380] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ebca50 00:35:48.903 [2024-06-10 11:46:32.598386] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ebca50 00:35:48.903 [2024-06-10 11:46:32.598525] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:48.903 NewBaseBdev 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:48.903 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:35:49.162 [ 00:35:49.162 { 00:35:49.162 "name": "NewBaseBdev", 00:35:49.162 "aliases": [ 00:35:49.162 "ca0bb4e3-b4de-423e-9092-70e804bfa97e" 00:35:49.162 ], 00:35:49.162 "product_name": "Malloc disk", 00:35:49.162 "block_size": 512, 
00:35:49.162 "num_blocks": 65536, 00:35:49.162 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:49.162 "assigned_rate_limits": { 00:35:49.162 "rw_ios_per_sec": 0, 00:35:49.162 "rw_mbytes_per_sec": 0, 00:35:49.162 "r_mbytes_per_sec": 0, 00:35:49.162 "w_mbytes_per_sec": 0 00:35:49.162 }, 00:35:49.162 "claimed": true, 00:35:49.162 "claim_type": "exclusive_write", 00:35:49.162 "zoned": false, 00:35:49.162 "supported_io_types": { 00:35:49.162 "read": true, 00:35:49.162 "write": true, 00:35:49.162 "unmap": true, 00:35:49.162 "write_zeroes": true, 00:35:49.162 "flush": true, 00:35:49.162 "reset": true, 00:35:49.162 "compare": false, 00:35:49.162 "compare_and_write": false, 00:35:49.162 "abort": true, 00:35:49.162 "nvme_admin": false, 00:35:49.162 "nvme_io": false 00:35:49.162 }, 00:35:49.162 "memory_domains": [ 00:35:49.162 { 00:35:49.162 "dma_device_id": "system", 00:35:49.162 "dma_device_type": 1 00:35:49.162 }, 00:35:49.162 { 00:35:49.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:49.162 "dma_device_type": 2 00:35:49.162 } 00:35:49.162 ], 00:35:49.162 "driver_specific": {} 00:35:49.162 } 00:35:49.162 ] 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:49.162 11:46:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:49.162 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:49.421 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:49.421 "name": "Existed_Raid", 00:35:49.421 "uuid": "60971877-ba3e-4815-a197-2d3cd836ed5e", 00:35:49.421 "strip_size_kb": 0, 00:35:49.421 "state": "online", 00:35:49.421 "raid_level": "raid1", 00:35:49.421 "superblock": false, 00:35:49.421 "num_base_bdevs": 4, 00:35:49.421 "num_base_bdevs_discovered": 4, 00:35:49.421 "num_base_bdevs_operational": 4, 00:35:49.421 "base_bdevs_list": [ 00:35:49.421 { 00:35:49.421 "name": "NewBaseBdev", 00:35:49.421 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:49.421 "is_configured": true, 00:35:49.421 "data_offset": 0, 00:35:49.421 "data_size": 65536 00:35:49.421 }, 00:35:49.421 { 00:35:49.421 "name": "BaseBdev2", 00:35:49.421 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:49.421 "is_configured": true, 00:35:49.421 "data_offset": 0, 00:35:49.421 "data_size": 65536 00:35:49.421 }, 00:35:49.421 { 00:35:49.421 "name": "BaseBdev3", 00:35:49.421 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:49.421 "is_configured": true, 00:35:49.421 "data_offset": 0, 00:35:49.421 "data_size": 65536 00:35:49.421 }, 00:35:49.421 { 00:35:49.421 "name": "BaseBdev4", 00:35:49.421 "uuid": 
"e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:49.421 "is_configured": true, 00:35:49.421 "data_offset": 0, 00:35:49.421 "data_size": 65536 00:35:49.421 } 00:35:49.421 ] 00:35:49.422 }' 00:35:49.422 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:49.422 11:46:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:35:49.681 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:35:49.940 [2024-06-10 11:46:33.729255] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:49.940 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:35:49.940 "name": "Existed_Raid", 00:35:49.940 "aliases": [ 00:35:49.940 "60971877-ba3e-4815-a197-2d3cd836ed5e" 00:35:49.940 ], 00:35:49.940 "product_name": "Raid Volume", 00:35:49.940 "block_size": 512, 00:35:49.940 "num_blocks": 65536, 00:35:49.940 "uuid": "60971877-ba3e-4815-a197-2d3cd836ed5e", 00:35:49.940 "assigned_rate_limits": { 00:35:49.940 "rw_ios_per_sec": 0, 00:35:49.940 "rw_mbytes_per_sec": 
0, 00:35:49.940 "r_mbytes_per_sec": 0, 00:35:49.940 "w_mbytes_per_sec": 0 00:35:49.940 }, 00:35:49.940 "claimed": false, 00:35:49.940 "zoned": false, 00:35:49.940 "supported_io_types": { 00:35:49.940 "read": true, 00:35:49.940 "write": true, 00:35:49.940 "unmap": false, 00:35:49.940 "write_zeroes": true, 00:35:49.940 "flush": false, 00:35:49.940 "reset": true, 00:35:49.940 "compare": false, 00:35:49.940 "compare_and_write": false, 00:35:49.940 "abort": false, 00:35:49.940 "nvme_admin": false, 00:35:49.940 "nvme_io": false 00:35:49.940 }, 00:35:49.940 "memory_domains": [ 00:35:49.940 { 00:35:49.940 "dma_device_id": "system", 00:35:49.940 "dma_device_type": 1 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:49.940 "dma_device_type": 2 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "system", 00:35:49.940 "dma_device_type": 1 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:49.940 "dma_device_type": 2 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "system", 00:35:49.940 "dma_device_type": 1 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:49.940 "dma_device_type": 2 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "system", 00:35:49.940 "dma_device_type": 1 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:49.940 "dma_device_type": 2 00:35:49.940 } 00:35:49.940 ], 00:35:49.940 "driver_specific": { 00:35:49.940 "raid": { 00:35:49.940 "uuid": "60971877-ba3e-4815-a197-2d3cd836ed5e", 00:35:49.940 "strip_size_kb": 0, 00:35:49.940 "state": "online", 00:35:49.940 "raid_level": "raid1", 00:35:49.940 "superblock": false, 00:35:49.940 "num_base_bdevs": 4, 00:35:49.940 "num_base_bdevs_discovered": 4, 00:35:49.940 "num_base_bdevs_operational": 4, 00:35:49.940 "base_bdevs_list": [ 00:35:49.940 { 00:35:49.940 "name": "NewBaseBdev", 00:35:49.940 "uuid": 
"ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:49.940 "is_configured": true, 00:35:49.940 "data_offset": 0, 00:35:49.940 "data_size": 65536 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "name": "BaseBdev2", 00:35:49.940 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:49.940 "is_configured": true, 00:35:49.940 "data_offset": 0, 00:35:49.940 "data_size": 65536 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "name": "BaseBdev3", 00:35:49.940 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:49.940 "is_configured": true, 00:35:49.940 "data_offset": 0, 00:35:49.940 "data_size": 65536 00:35:49.940 }, 00:35:49.940 { 00:35:49.940 "name": "BaseBdev4", 00:35:49.940 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:49.940 "is_configured": true, 00:35:49.940 "data_offset": 0, 00:35:49.940 "data_size": 65536 00:35:49.940 } 00:35:49.940 ] 00:35:49.940 } 00:35:49.940 } 00:35:49.940 }' 00:35:49.940 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:35:49.940 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:35:49.940 BaseBdev2 00:35:49.940 BaseBdev3 00:35:49.940 BaseBdev4' 00:35:49.940 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:49.940 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:35:49.940 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:50.199 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:50.199 "name": "NewBaseBdev", 00:35:50.199 "aliases": [ 00:35:50.199 "ca0bb4e3-b4de-423e-9092-70e804bfa97e" 00:35:50.199 ], 00:35:50.199 "product_name": "Malloc disk", 00:35:50.199 "block_size": 512, 00:35:50.199 
"num_blocks": 65536, 00:35:50.199 "uuid": "ca0bb4e3-b4de-423e-9092-70e804bfa97e", 00:35:50.199 "assigned_rate_limits": { 00:35:50.199 "rw_ios_per_sec": 0, 00:35:50.200 "rw_mbytes_per_sec": 0, 00:35:50.200 "r_mbytes_per_sec": 0, 00:35:50.200 "w_mbytes_per_sec": 0 00:35:50.200 }, 00:35:50.200 "claimed": true, 00:35:50.200 "claim_type": "exclusive_write", 00:35:50.200 "zoned": false, 00:35:50.200 "supported_io_types": { 00:35:50.200 "read": true, 00:35:50.200 "write": true, 00:35:50.200 "unmap": true, 00:35:50.200 "write_zeroes": true, 00:35:50.200 "flush": true, 00:35:50.200 "reset": true, 00:35:50.200 "compare": false, 00:35:50.200 "compare_and_write": false, 00:35:50.200 "abort": true, 00:35:50.200 "nvme_admin": false, 00:35:50.200 "nvme_io": false 00:35:50.200 }, 00:35:50.200 "memory_domains": [ 00:35:50.200 { 00:35:50.200 "dma_device_id": "system", 00:35:50.200 "dma_device_type": 1 00:35:50.200 }, 00:35:50.200 { 00:35:50.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:50.200 "dma_device_type": 2 00:35:50.200 } 00:35:50.200 ], 00:35:50.200 "driver_specific": {} 00:35:50.200 }' 00:35:50.200 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:50.200 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:50.200 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:50.200 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:50.200 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:35:50.458 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:50.718 "name": "BaseBdev2", 00:35:50.718 "aliases": [ 00:35:50.718 "c10581b9-98fa-4177-a1a9-c130e76910ab" 00:35:50.718 ], 00:35:50.718 "product_name": "Malloc disk", 00:35:50.718 "block_size": 512, 00:35:50.718 "num_blocks": 65536, 00:35:50.718 "uuid": "c10581b9-98fa-4177-a1a9-c130e76910ab", 00:35:50.718 "assigned_rate_limits": { 00:35:50.718 "rw_ios_per_sec": 0, 00:35:50.718 "rw_mbytes_per_sec": 0, 00:35:50.718 "r_mbytes_per_sec": 0, 00:35:50.718 "w_mbytes_per_sec": 0 00:35:50.718 }, 00:35:50.718 "claimed": true, 00:35:50.718 "claim_type": "exclusive_write", 00:35:50.718 "zoned": false, 00:35:50.718 "supported_io_types": { 00:35:50.718 "read": true, 00:35:50.718 "write": true, 00:35:50.718 "unmap": true, 00:35:50.718 "write_zeroes": true, 00:35:50.718 "flush": true, 00:35:50.718 "reset": true, 00:35:50.718 "compare": false, 00:35:50.718 "compare_and_write": false, 00:35:50.718 "abort": true, 00:35:50.718 "nvme_admin": false, 00:35:50.718 "nvme_io": false 00:35:50.718 }, 00:35:50.718 "memory_domains": [ 00:35:50.718 { 00:35:50.718 "dma_device_id": "system", 00:35:50.718 "dma_device_type": 1 
00:35:50.718 }, 00:35:50.718 { 00:35:50.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:50.718 "dma_device_type": 2 00:35:50.718 } 00:35:50.718 ], 00:35:50.718 "driver_specific": {} 00:35:50.718 }' 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:50.718 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:50.977 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:35:51.236 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:51.236 "name": "BaseBdev3", 
00:35:51.236 "aliases": [ 00:35:51.236 "9c9bfd58-7be4-412d-b9cc-b18697860c2e" 00:35:51.236 ], 00:35:51.236 "product_name": "Malloc disk", 00:35:51.236 "block_size": 512, 00:35:51.236 "num_blocks": 65536, 00:35:51.236 "uuid": "9c9bfd58-7be4-412d-b9cc-b18697860c2e", 00:35:51.236 "assigned_rate_limits": { 00:35:51.236 "rw_ios_per_sec": 0, 00:35:51.236 "rw_mbytes_per_sec": 0, 00:35:51.236 "r_mbytes_per_sec": 0, 00:35:51.236 "w_mbytes_per_sec": 0 00:35:51.236 }, 00:35:51.236 "claimed": true, 00:35:51.236 "claim_type": "exclusive_write", 00:35:51.236 "zoned": false, 00:35:51.236 "supported_io_types": { 00:35:51.236 "read": true, 00:35:51.236 "write": true, 00:35:51.236 "unmap": true, 00:35:51.236 "write_zeroes": true, 00:35:51.236 "flush": true, 00:35:51.236 "reset": true, 00:35:51.236 "compare": false, 00:35:51.236 "compare_and_write": false, 00:35:51.236 "abort": true, 00:35:51.236 "nvme_admin": false, 00:35:51.236 "nvme_io": false 00:35:51.236 }, 00:35:51.236 "memory_domains": [ 00:35:51.236 { 00:35:51.236 "dma_device_id": "system", 00:35:51.236 "dma_device_type": 1 00:35:51.236 }, 00:35:51.236 { 00:35:51.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:51.236 "dma_device_type": 2 00:35:51.236 } 00:35:51.236 ], 00:35:51.236 "driver_specific": {} 00:35:51.236 }' 00:35:51.236 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:51.236 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:51.236 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:51.237 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:51.237 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:51.237 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:51.237 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:35:51.237 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:51.237 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:51.237 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:51.495 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:35:51.496 "name": "BaseBdev4", 00:35:51.496 "aliases": [ 00:35:51.496 "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d" 00:35:51.496 ], 00:35:51.496 "product_name": "Malloc disk", 00:35:51.496 "block_size": 512, 00:35:51.496 "num_blocks": 65536, 00:35:51.496 "uuid": "e40bdbc7-10f9-4dd2-acc1-7c63cb12df3d", 00:35:51.496 "assigned_rate_limits": { 00:35:51.496 "rw_ios_per_sec": 0, 00:35:51.496 "rw_mbytes_per_sec": 0, 00:35:51.496 "r_mbytes_per_sec": 0, 00:35:51.496 "w_mbytes_per_sec": 0 00:35:51.496 }, 00:35:51.496 "claimed": true, 00:35:51.496 "claim_type": "exclusive_write", 00:35:51.496 "zoned": false, 00:35:51.496 "supported_io_types": { 00:35:51.496 "read": true, 00:35:51.496 "write": true, 00:35:51.496 "unmap": true, 00:35:51.496 "write_zeroes": true, 00:35:51.496 "flush": true, 00:35:51.496 "reset": true, 00:35:51.496 "compare": false, 00:35:51.496 "compare_and_write": false, 00:35:51.496 "abort": true, 00:35:51.496 "nvme_admin": false, 00:35:51.496 
"nvme_io": false 00:35:51.496 }, 00:35:51.496 "memory_domains": [ 00:35:51.496 { 00:35:51.496 "dma_device_id": "system", 00:35:51.496 "dma_device_type": 1 00:35:51.496 }, 00:35:51.496 { 00:35:51.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:51.496 "dma_device_type": 2 00:35:51.496 } 00:35:51.496 ], 00:35:51.496 "driver_specific": {} 00:35:51.496 }' 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:51.496 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:51.755 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:35:52.014 [2024-06-10 11:46:35.850537] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:35:52.014 [2024-06-10 11:46:35.850561] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:35:52.014 [2024-06-10 11:46:35.850598] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:52.014 [2024-06-10 11:46:35.850789] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:52.014 [2024-06-10 11:46:35.850803] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebca50 name Existed_Raid, state offline 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 203558 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 203558 ']' 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 203558 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 203558 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:52.014 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 203558' 00:35:52.015 killing process with pid 203558 00:35:52.015 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 203558 00:35:52.015 [2024-06-10 11:46:35.913212] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:52.015 11:46:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 203558 00:35:52.015 [2024-06-10 11:46:35.948257] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 
00:35:52.274 11:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:35:52.274 00:35:52.274 real 0m24.953s 00:35:52.274 user 0m45.392s 00:35:52.274 sys 0m4.930s 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:35:52.274 ************************************ 00:35:52.274 END TEST raid_state_function_test 00:35:52.274 ************************************ 00:35:52.274 11:46:36 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:35:52.274 11:46:36 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:35:52.274 11:46:36 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:52.274 11:46:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:52.274 ************************************ 00:35:52.274 START TEST raid_state_function_test_sb 00:35:52.274 ************************************ 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 true 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 
00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' 
raid1 '!=' raid1 ']' 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=207537 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 207537' 00:35:52.274 Process raid pid: 207537 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 207537 /var/tmp/spdk-raid.sock 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 207537 ']' 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:52.274 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:52.274 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:52.534 [2024-06-10 11:46:36.265426] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:35:52.534 [2024-06-10 11:46:36.265476] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:52.534 [2024-06-10 11:46:36.352688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:52.534 [2024-06-10 11:46:36.431518] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:52.792 [2024-06-10 11:46:36.484126] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:52.792 [2024-06-10 11:46:36.484149] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:53.360 [2024-06-10 11:46:37.221926] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:35:53.360 [2024-06-10 11:46:37.221960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:35:53.360 [2024-06-10 11:46:37.221968] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:35:53.360 [2024-06-10 11:46:37.221979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:35:53.360 [2024-06-10 11:46:37.221984] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:35:53.360 [2024-06-10 11:46:37.221992] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:35:53.360 [2024-06-10 
11:46:37.221997] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:35:53.360 [2024-06-10 11:46:37.222004] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:53.360 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:53.619 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:53.619 "name": "Existed_Raid", 00:35:53.619 "uuid": "03ee0872-01f2-447b-8733-f3bc75822f52", 00:35:53.619 "strip_size_kb": 
0, 00:35:53.619 "state": "configuring", 00:35:53.619 "raid_level": "raid1", 00:35:53.619 "superblock": true, 00:35:53.619 "num_base_bdevs": 4, 00:35:53.619 "num_base_bdevs_discovered": 0, 00:35:53.619 "num_base_bdevs_operational": 4, 00:35:53.619 "base_bdevs_list": [ 00:35:53.619 { 00:35:53.619 "name": "BaseBdev1", 00:35:53.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:53.619 "is_configured": false, 00:35:53.619 "data_offset": 0, 00:35:53.619 "data_size": 0 00:35:53.619 }, 00:35:53.619 { 00:35:53.619 "name": "BaseBdev2", 00:35:53.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:53.619 "is_configured": false, 00:35:53.619 "data_offset": 0, 00:35:53.619 "data_size": 0 00:35:53.619 }, 00:35:53.619 { 00:35:53.619 "name": "BaseBdev3", 00:35:53.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:53.619 "is_configured": false, 00:35:53.619 "data_offset": 0, 00:35:53.619 "data_size": 0 00:35:53.619 }, 00:35:53.619 { 00:35:53.619 "name": "BaseBdev4", 00:35:53.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:53.619 "is_configured": false, 00:35:53.619 "data_offset": 0, 00:35:53.619 "data_size": 0 00:35:53.619 } 00:35:53.619 ] 00:35:53.619 }' 00:35:53.619 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:53.619 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:54.185 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:35:54.185 [2024-06-10 11:46:38.035916] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:35:54.185 [2024-06-10 11:46:38.035938] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x216b550 name Existed_Raid, state configuring 00:35:54.185 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:54.443 [2024-06-10 11:46:38.196355] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:35:54.443 [2024-06-10 11:46:38.196384] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:35:54.443 [2024-06-10 11:46:38.196390] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:35:54.443 [2024-06-10 11:46:38.196397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:35:54.443 [2024-06-10 11:46:38.196403] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:35:54.443 [2024-06-10 11:46:38.196410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:35:54.443 [2024-06-10 11:46:38.196415] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:35:54.443 [2024-06-10 11:46:38.196422] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:35:54.443 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:35:54.443 [2024-06-10 11:46:38.378588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:54.443 BaseBdev1 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:54.702 11:46:38 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:54.702 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:35:54.960 [ 00:35:54.960 { 00:35:54.960 "name": "BaseBdev1", 00:35:54.960 "aliases": [ 00:35:54.960 "de664f72-7281-4c52-a623-dce27bd4c4de" 00:35:54.960 ], 00:35:54.960 "product_name": "Malloc disk", 00:35:54.960 "block_size": 512, 00:35:54.960 "num_blocks": 65536, 00:35:54.960 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:35:54.960 "assigned_rate_limits": { 00:35:54.960 "rw_ios_per_sec": 0, 00:35:54.960 "rw_mbytes_per_sec": 0, 00:35:54.960 "r_mbytes_per_sec": 0, 00:35:54.960 "w_mbytes_per_sec": 0 00:35:54.960 }, 00:35:54.960 "claimed": true, 00:35:54.960 "claim_type": "exclusive_write", 00:35:54.960 "zoned": false, 00:35:54.960 "supported_io_types": { 00:35:54.960 "read": true, 00:35:54.960 "write": true, 00:35:54.960 "unmap": true, 00:35:54.960 "write_zeroes": true, 00:35:54.960 "flush": true, 00:35:54.960 "reset": true, 00:35:54.960 "compare": false, 00:35:54.960 "compare_and_write": false, 00:35:54.960 "abort": true, 00:35:54.960 "nvme_admin": false, 00:35:54.960 "nvme_io": false 00:35:54.960 }, 00:35:54.960 "memory_domains": [ 00:35:54.960 { 00:35:54.960 "dma_device_id": "system", 00:35:54.960 "dma_device_type": 1 00:35:54.960 }, 00:35:54.960 { 00:35:54.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:54.960 
"dma_device_type": 2 00:35:54.960 } 00:35:54.960 ], 00:35:54.960 "driver_specific": {} 00:35:54.960 } 00:35:54.960 ] 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:54.960 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:55.218 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:55.218 "name": "Existed_Raid", 00:35:55.218 "uuid": "fcfb3da3-f4fa-4044-9ece-11e834ebec0c", 00:35:55.218 "strip_size_kb": 0, 
00:35:55.218 "state": "configuring", 00:35:55.218 "raid_level": "raid1", 00:35:55.218 "superblock": true, 00:35:55.218 "num_base_bdevs": 4, 00:35:55.218 "num_base_bdevs_discovered": 1, 00:35:55.218 "num_base_bdevs_operational": 4, 00:35:55.218 "base_bdevs_list": [ 00:35:55.218 { 00:35:55.218 "name": "BaseBdev1", 00:35:55.218 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:35:55.218 "is_configured": true, 00:35:55.218 "data_offset": 2048, 00:35:55.218 "data_size": 63488 00:35:55.218 }, 00:35:55.218 { 00:35:55.219 "name": "BaseBdev2", 00:35:55.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:55.219 "is_configured": false, 00:35:55.219 "data_offset": 0, 00:35:55.219 "data_size": 0 00:35:55.219 }, 00:35:55.219 { 00:35:55.219 "name": "BaseBdev3", 00:35:55.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:55.219 "is_configured": false, 00:35:55.219 "data_offset": 0, 00:35:55.219 "data_size": 0 00:35:55.219 }, 00:35:55.219 { 00:35:55.219 "name": "BaseBdev4", 00:35:55.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:55.219 "is_configured": false, 00:35:55.219 "data_offset": 0, 00:35:55.219 "data_size": 0 00:35:55.219 } 00:35:55.219 ] 00:35:55.219 }' 00:35:55.219 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:55.219 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:55.477 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:35:55.735 [2024-06-10 11:46:39.561646] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:35:55.735 [2024-06-10 11:46:39.561680] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x216adc0 name Existed_Raid, state configuring 00:35:55.735 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:35:55.994 [2024-06-10 11:46:39.734129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:55.994 [2024-06-10 11:46:39.735222] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:35:55.994 [2024-06-10 11:46:39.735249] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:35:55.994 [2024-06-10 11:46:39.735257] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:35:55.994 [2024-06-10 11:46:39.735265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:35:55.994 [2024-06-10 11:46:39.735271] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:35:55.994 [2024-06-10 11:46:39.735278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:55.994 "name": "Existed_Raid", 00:35:55.994 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:35:55.994 "strip_size_kb": 0, 00:35:55.994 "state": "configuring", 00:35:55.994 "raid_level": "raid1", 00:35:55.994 "superblock": true, 00:35:55.994 "num_base_bdevs": 4, 00:35:55.994 "num_base_bdevs_discovered": 1, 00:35:55.994 "num_base_bdevs_operational": 4, 00:35:55.994 "base_bdevs_list": [ 00:35:55.994 { 00:35:55.994 "name": "BaseBdev1", 00:35:55.994 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:35:55.994 "is_configured": true, 00:35:55.994 "data_offset": 2048, 00:35:55.994 "data_size": 63488 00:35:55.994 }, 00:35:55.994 { 00:35:55.994 "name": "BaseBdev2", 00:35:55.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:55.994 "is_configured": false, 00:35:55.994 "data_offset": 0, 00:35:55.994 "data_size": 0 00:35:55.994 }, 00:35:55.994 { 00:35:55.994 "name": "BaseBdev3", 00:35:55.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:55.994 "is_configured": false, 00:35:55.994 "data_offset": 0, 00:35:55.994 
"data_size": 0 00:35:55.994 }, 00:35:55.994 { 00:35:55.994 "name": "BaseBdev4", 00:35:55.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:55.994 "is_configured": false, 00:35:55.994 "data_offset": 0, 00:35:55.994 "data_size": 0 00:35:55.994 } 00:35:55.994 ] 00:35:55.994 }' 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:55.994 11:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:56.558 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:35:56.816 [2024-06-10 11:46:40.583948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:56.816 BaseBdev2 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:56.816 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:35:57.074 [ 
00:35:57.074 { 00:35:57.074 "name": "BaseBdev2", 00:35:57.074 "aliases": [ 00:35:57.074 "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0" 00:35:57.074 ], 00:35:57.074 "product_name": "Malloc disk", 00:35:57.074 "block_size": 512, 00:35:57.074 "num_blocks": 65536, 00:35:57.074 "uuid": "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:35:57.074 "assigned_rate_limits": { 00:35:57.074 "rw_ios_per_sec": 0, 00:35:57.074 "rw_mbytes_per_sec": 0, 00:35:57.074 "r_mbytes_per_sec": 0, 00:35:57.074 "w_mbytes_per_sec": 0 00:35:57.074 }, 00:35:57.074 "claimed": true, 00:35:57.074 "claim_type": "exclusive_write", 00:35:57.074 "zoned": false, 00:35:57.074 "supported_io_types": { 00:35:57.074 "read": true, 00:35:57.074 "write": true, 00:35:57.074 "unmap": true, 00:35:57.074 "write_zeroes": true, 00:35:57.074 "flush": true, 00:35:57.074 "reset": true, 00:35:57.074 "compare": false, 00:35:57.074 "compare_and_write": false, 00:35:57.074 "abort": true, 00:35:57.074 "nvme_admin": false, 00:35:57.074 "nvme_io": false 00:35:57.074 }, 00:35:57.074 "memory_domains": [ 00:35:57.074 { 00:35:57.074 "dma_device_id": "system", 00:35:57.074 "dma_device_type": 1 00:35:57.074 }, 00:35:57.074 { 00:35:57.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:57.074 "dma_device_type": 2 00:35:57.074 } 00:35:57.074 ], 00:35:57.074 "driver_specific": {} 00:35:57.074 } 00:35:57.074 ] 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:57.074 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:57.075 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:57.075 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:57.075 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:57.075 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:57.075 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:57.075 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:57.333 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:57.333 "name": "Existed_Raid", 00:35:57.333 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:35:57.333 "strip_size_kb": 0, 00:35:57.333 "state": "configuring", 00:35:57.333 "raid_level": "raid1", 00:35:57.333 "superblock": true, 00:35:57.333 "num_base_bdevs": 4, 00:35:57.333 "num_base_bdevs_discovered": 2, 00:35:57.333 "num_base_bdevs_operational": 4, 00:35:57.333 "base_bdevs_list": [ 00:35:57.333 { 00:35:57.333 "name": "BaseBdev1", 00:35:57.333 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:35:57.333 "is_configured": true, 00:35:57.333 "data_offset": 2048, 00:35:57.333 "data_size": 63488 00:35:57.333 }, 00:35:57.333 { 00:35:57.333 "name": "BaseBdev2", 00:35:57.333 "uuid": 
"7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:35:57.333 "is_configured": true, 00:35:57.333 "data_offset": 2048, 00:35:57.333 "data_size": 63488 00:35:57.333 }, 00:35:57.333 { 00:35:57.333 "name": "BaseBdev3", 00:35:57.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:57.333 "is_configured": false, 00:35:57.333 "data_offset": 0, 00:35:57.333 "data_size": 0 00:35:57.333 }, 00:35:57.333 { 00:35:57.333 "name": "BaseBdev4", 00:35:57.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:57.333 "is_configured": false, 00:35:57.333 "data_offset": 0, 00:35:57.333 "data_size": 0 00:35:57.333 } 00:35:57.333 ] 00:35:57.333 }' 00:35:57.333 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:57.333 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:35:57.900 [2024-06-10 11:46:41.747040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:35:57.900 BaseBdev3 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:57.900 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:58.158 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:35:58.158 [ 00:35:58.158 { 00:35:58.158 "name": "BaseBdev3", 00:35:58.158 "aliases": [ 00:35:58.158 "c7c41121-34a0-4319-a32d-c4f827f46401" 00:35:58.158 ], 00:35:58.158 "product_name": "Malloc disk", 00:35:58.158 "block_size": 512, 00:35:58.158 "num_blocks": 65536, 00:35:58.158 "uuid": "c7c41121-34a0-4319-a32d-c4f827f46401", 00:35:58.158 "assigned_rate_limits": { 00:35:58.158 "rw_ios_per_sec": 0, 00:35:58.158 "rw_mbytes_per_sec": 0, 00:35:58.158 "r_mbytes_per_sec": 0, 00:35:58.158 "w_mbytes_per_sec": 0 00:35:58.158 }, 00:35:58.158 "claimed": true, 00:35:58.158 "claim_type": "exclusive_write", 00:35:58.158 "zoned": false, 00:35:58.158 "supported_io_types": { 00:35:58.158 "read": true, 00:35:58.158 "write": true, 00:35:58.158 "unmap": true, 00:35:58.158 "write_zeroes": true, 00:35:58.158 "flush": true, 00:35:58.158 "reset": true, 00:35:58.158 "compare": false, 00:35:58.158 "compare_and_write": false, 00:35:58.158 "abort": true, 00:35:58.158 "nvme_admin": false, 00:35:58.158 "nvme_io": false 00:35:58.158 }, 00:35:58.158 "memory_domains": [ 00:35:58.158 { 00:35:58.158 "dma_device_id": "system", 00:35:58.158 "dma_device_type": 1 00:35:58.158 }, 00:35:58.158 { 00:35:58.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:58.158 "dma_device_type": 2 00:35:58.158 } 00:35:58.158 ], 00:35:58.158 "driver_specific": {} 00:35:58.158 } 00:35:58.158 ] 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:58.158 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:58.415 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:58.415 "name": "Existed_Raid", 00:35:58.415 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:35:58.415 "strip_size_kb": 0, 00:35:58.415 "state": "configuring", 00:35:58.415 "raid_level": "raid1", 00:35:58.415 "superblock": true, 00:35:58.415 "num_base_bdevs": 4, 00:35:58.415 "num_base_bdevs_discovered": 3, 00:35:58.415 
"num_base_bdevs_operational": 4, 00:35:58.415 "base_bdevs_list": [ 00:35:58.415 { 00:35:58.415 "name": "BaseBdev1", 00:35:58.415 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:35:58.415 "is_configured": true, 00:35:58.415 "data_offset": 2048, 00:35:58.415 "data_size": 63488 00:35:58.415 }, 00:35:58.415 { 00:35:58.415 "name": "BaseBdev2", 00:35:58.415 "uuid": "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:35:58.415 "is_configured": true, 00:35:58.415 "data_offset": 2048, 00:35:58.415 "data_size": 63488 00:35:58.415 }, 00:35:58.415 { 00:35:58.415 "name": "BaseBdev3", 00:35:58.415 "uuid": "c7c41121-34a0-4319-a32d-c4f827f46401", 00:35:58.415 "is_configured": true, 00:35:58.415 "data_offset": 2048, 00:35:58.415 "data_size": 63488 00:35:58.415 }, 00:35:58.415 { 00:35:58.415 "name": "BaseBdev4", 00:35:58.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:58.415 "is_configured": false, 00:35:58.415 "data_offset": 0, 00:35:58.415 "data_size": 0 00:35:58.415 } 00:35:58.415 ] 00:35:58.415 }' 00:35:58.415 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:58.415 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:35:58.981 [2024-06-10 11:46:42.896857] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:35:58.981 [2024-06-10 11:46:42.896999] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x216be20 00:35:58.981 [2024-06-10 11:46:42.897010] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:35:58.981 [2024-06-10 11:46:42.897133] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x216ca70 00:35:58.981 [2024-06-10 11:46:42.897224] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x216be20 00:35:58.981 [2024-06-10 11:46:42.897230] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x216be20 00:35:58.981 [2024-06-10 11:46:42.897295] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:58.981 BaseBdev4 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:35:58.981 11:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:35:59.239 11:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:35:59.497 [ 00:35:59.497 { 00:35:59.497 "name": "BaseBdev4", 00:35:59.497 "aliases": [ 00:35:59.497 "346414f4-f343-4488-88ee-9b30ee93ce44" 00:35:59.497 ], 00:35:59.497 "product_name": "Malloc disk", 00:35:59.497 "block_size": 512, 00:35:59.497 "num_blocks": 65536, 00:35:59.497 "uuid": "346414f4-f343-4488-88ee-9b30ee93ce44", 00:35:59.497 "assigned_rate_limits": { 00:35:59.497 "rw_ios_per_sec": 0, 00:35:59.497 "rw_mbytes_per_sec": 0, 00:35:59.497 "r_mbytes_per_sec": 0, 00:35:59.497 
"w_mbytes_per_sec": 0 00:35:59.497 }, 00:35:59.497 "claimed": true, 00:35:59.497 "claim_type": "exclusive_write", 00:35:59.497 "zoned": false, 00:35:59.497 "supported_io_types": { 00:35:59.497 "read": true, 00:35:59.497 "write": true, 00:35:59.497 "unmap": true, 00:35:59.497 "write_zeroes": true, 00:35:59.497 "flush": true, 00:35:59.497 "reset": true, 00:35:59.497 "compare": false, 00:35:59.497 "compare_and_write": false, 00:35:59.497 "abort": true, 00:35:59.497 "nvme_admin": false, 00:35:59.497 "nvme_io": false 00:35:59.497 }, 00:35:59.497 "memory_domains": [ 00:35:59.497 { 00:35:59.497 "dma_device_id": "system", 00:35:59.497 "dma_device_type": 1 00:35:59.497 }, 00:35:59.497 { 00:35:59.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:59.497 "dma_device_type": 2 00:35:59.497 } 00:35:59.497 ], 00:35:59.497 "driver_specific": {} 00:35:59.497 } 00:35:59.497 ] 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:59.497 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:35:59.756 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:59.756 "name": "Existed_Raid", 00:35:59.756 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:35:59.756 "strip_size_kb": 0, 00:35:59.756 "state": "online", 00:35:59.756 "raid_level": "raid1", 00:35:59.756 "superblock": true, 00:35:59.756 "num_base_bdevs": 4, 00:35:59.756 "num_base_bdevs_discovered": 4, 00:35:59.756 "num_base_bdevs_operational": 4, 00:35:59.756 "base_bdevs_list": [ 00:35:59.756 { 00:35:59.756 "name": "BaseBdev1", 00:35:59.756 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:35:59.756 "is_configured": true, 00:35:59.756 "data_offset": 2048, 00:35:59.756 "data_size": 63488 00:35:59.756 }, 00:35:59.756 { 00:35:59.756 "name": "BaseBdev2", 00:35:59.756 "uuid": "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:35:59.756 "is_configured": true, 00:35:59.756 "data_offset": 2048, 00:35:59.756 "data_size": 63488 00:35:59.756 }, 00:35:59.756 { 00:35:59.756 "name": "BaseBdev3", 00:35:59.756 "uuid": "c7c41121-34a0-4319-a32d-c4f827f46401", 00:35:59.756 "is_configured": true, 00:35:59.756 "data_offset": 2048, 00:35:59.756 "data_size": 63488 00:35:59.756 }, 00:35:59.756 { 00:35:59.756 "name": "BaseBdev4", 00:35:59.756 "uuid": 
"346414f4-f343-4488-88ee-9b30ee93ce44", 00:35:59.756 "is_configured": true, 00:35:59.756 "data_offset": 2048, 00:35:59.756 "data_size": 63488 00:35:59.756 } 00:35:59.756 ] 00:35:59.756 }' 00:35:59.756 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:59.756 11:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:36:00.015 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:36:00.274 [2024-06-10 11:46:44.088154] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:00.274 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:36:00.274 "name": "Existed_Raid", 00:36:00.274 "aliases": [ 00:36:00.274 "6811b467-e44b-4071-8efe-005ec12374da" 00:36:00.274 ], 00:36:00.274 "product_name": "Raid Volume", 00:36:00.274 "block_size": 512, 00:36:00.274 "num_blocks": 63488, 00:36:00.274 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:36:00.274 "assigned_rate_limits": { 00:36:00.274 "rw_ios_per_sec": 
0, 00:36:00.274 "rw_mbytes_per_sec": 0, 00:36:00.274 "r_mbytes_per_sec": 0, 00:36:00.274 "w_mbytes_per_sec": 0 00:36:00.274 }, 00:36:00.274 "claimed": false, 00:36:00.274 "zoned": false, 00:36:00.274 "supported_io_types": { 00:36:00.274 "read": true, 00:36:00.274 "write": true, 00:36:00.274 "unmap": false, 00:36:00.274 "write_zeroes": true, 00:36:00.274 "flush": false, 00:36:00.274 "reset": true, 00:36:00.274 "compare": false, 00:36:00.274 "compare_and_write": false, 00:36:00.274 "abort": false, 00:36:00.274 "nvme_admin": false, 00:36:00.274 "nvme_io": false 00:36:00.275 }, 00:36:00.275 "memory_domains": [ 00:36:00.275 { 00:36:00.275 "dma_device_id": "system", 00:36:00.275 "dma_device_type": 1 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:00.275 "dma_device_type": 2 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "system", 00:36:00.275 "dma_device_type": 1 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:00.275 "dma_device_type": 2 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "system", 00:36:00.275 "dma_device_type": 1 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:00.275 "dma_device_type": 2 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "system", 00:36:00.275 "dma_device_type": 1 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:00.275 "dma_device_type": 2 00:36:00.275 } 00:36:00.275 ], 00:36:00.275 "driver_specific": { 00:36:00.275 "raid": { 00:36:00.275 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:36:00.275 "strip_size_kb": 0, 00:36:00.275 "state": "online", 00:36:00.275 "raid_level": "raid1", 00:36:00.275 "superblock": true, 00:36:00.275 "num_base_bdevs": 4, 00:36:00.275 "num_base_bdevs_discovered": 4, 00:36:00.275 "num_base_bdevs_operational": 4, 00:36:00.275 "base_bdevs_list": [ 00:36:00.275 { 00:36:00.275 "name": "BaseBdev1", 
00:36:00.275 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:36:00.275 "is_configured": true, 00:36:00.275 "data_offset": 2048, 00:36:00.275 "data_size": 63488 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "name": "BaseBdev2", 00:36:00.275 "uuid": "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:36:00.275 "is_configured": true, 00:36:00.275 "data_offset": 2048, 00:36:00.275 "data_size": 63488 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "name": "BaseBdev3", 00:36:00.275 "uuid": "c7c41121-34a0-4319-a32d-c4f827f46401", 00:36:00.275 "is_configured": true, 00:36:00.275 "data_offset": 2048, 00:36:00.275 "data_size": 63488 00:36:00.275 }, 00:36:00.275 { 00:36:00.275 "name": "BaseBdev4", 00:36:00.275 "uuid": "346414f4-f343-4488-88ee-9b30ee93ce44", 00:36:00.275 "is_configured": true, 00:36:00.275 "data_offset": 2048, 00:36:00.275 "data_size": 63488 00:36:00.275 } 00:36:00.275 ] 00:36:00.275 } 00:36:00.275 } 00:36:00.275 }' 00:36:00.275 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:36:00.275 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:36:00.275 BaseBdev2 00:36:00.275 BaseBdev3 00:36:00.275 BaseBdev4' 00:36:00.275 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:00.275 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:36:00.275 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:00.534 "name": "BaseBdev1", 00:36:00.534 "aliases": [ 00:36:00.534 "de664f72-7281-4c52-a623-dce27bd4c4de" 00:36:00.534 ], 00:36:00.534 "product_name": "Malloc disk", 
00:36:00.534 "block_size": 512, 00:36:00.534 "num_blocks": 65536, 00:36:00.534 "uuid": "de664f72-7281-4c52-a623-dce27bd4c4de", 00:36:00.534 "assigned_rate_limits": { 00:36:00.534 "rw_ios_per_sec": 0, 00:36:00.534 "rw_mbytes_per_sec": 0, 00:36:00.534 "r_mbytes_per_sec": 0, 00:36:00.534 "w_mbytes_per_sec": 0 00:36:00.534 }, 00:36:00.534 "claimed": true, 00:36:00.534 "claim_type": "exclusive_write", 00:36:00.534 "zoned": false, 00:36:00.534 "supported_io_types": { 00:36:00.534 "read": true, 00:36:00.534 "write": true, 00:36:00.534 "unmap": true, 00:36:00.534 "write_zeroes": true, 00:36:00.534 "flush": true, 00:36:00.534 "reset": true, 00:36:00.534 "compare": false, 00:36:00.534 "compare_and_write": false, 00:36:00.534 "abort": true, 00:36:00.534 "nvme_admin": false, 00:36:00.534 "nvme_io": false 00:36:00.534 }, 00:36:00.534 "memory_domains": [ 00:36:00.534 { 00:36:00.534 "dma_device_id": "system", 00:36:00.534 "dma_device_type": 1 00:36:00.534 }, 00:36:00.534 { 00:36:00.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:00.534 "dma_device_type": 2 00:36:00.534 } 00:36:00.534 ], 00:36:00.534 "driver_specific": {} 00:36:00.534 }' 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:00.534 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:36:00.793 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:01.053 "name": "BaseBdev2", 00:36:01.053 "aliases": [ 00:36:01.053 "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0" 00:36:01.053 ], 00:36:01.053 "product_name": "Malloc disk", 00:36:01.053 "block_size": 512, 00:36:01.053 "num_blocks": 65536, 00:36:01.053 "uuid": "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:36:01.053 "assigned_rate_limits": { 00:36:01.053 "rw_ios_per_sec": 0, 00:36:01.053 "rw_mbytes_per_sec": 0, 00:36:01.053 "r_mbytes_per_sec": 0, 00:36:01.053 "w_mbytes_per_sec": 0 00:36:01.053 }, 00:36:01.053 "claimed": true, 00:36:01.053 "claim_type": "exclusive_write", 00:36:01.053 "zoned": false, 00:36:01.053 "supported_io_types": { 00:36:01.053 "read": true, 00:36:01.053 "write": true, 00:36:01.053 "unmap": true, 00:36:01.053 "write_zeroes": true, 00:36:01.053 "flush": true, 00:36:01.053 "reset": true, 00:36:01.053 "compare": false, 00:36:01.053 "compare_and_write": false, 00:36:01.053 "abort": true, 00:36:01.053 "nvme_admin": false, 00:36:01.053 "nvme_io": false 00:36:01.053 }, 00:36:01.053 "memory_domains": [ 00:36:01.053 { 
00:36:01.053 "dma_device_id": "system", 00:36:01.053 "dma_device_type": 1 00:36:01.053 }, 00:36:01.053 { 00:36:01.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:01.053 "dma_device_type": 2 00:36:01.053 } 00:36:01.053 ], 00:36:01.053 "driver_specific": {} 00:36:01.053 }' 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:01.053 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:36:01.312 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:01.572 11:46:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:01.572 "name": "BaseBdev3", 00:36:01.572 "aliases": [ 00:36:01.572 "c7c41121-34a0-4319-a32d-c4f827f46401" 00:36:01.572 ], 00:36:01.572 "product_name": "Malloc disk", 00:36:01.572 "block_size": 512, 00:36:01.572 "num_blocks": 65536, 00:36:01.572 "uuid": "c7c41121-34a0-4319-a32d-c4f827f46401", 00:36:01.572 "assigned_rate_limits": { 00:36:01.572 "rw_ios_per_sec": 0, 00:36:01.572 "rw_mbytes_per_sec": 0, 00:36:01.572 "r_mbytes_per_sec": 0, 00:36:01.572 "w_mbytes_per_sec": 0 00:36:01.572 }, 00:36:01.572 "claimed": true, 00:36:01.572 "claim_type": "exclusive_write", 00:36:01.572 "zoned": false, 00:36:01.572 "supported_io_types": { 00:36:01.572 "read": true, 00:36:01.572 "write": true, 00:36:01.572 "unmap": true, 00:36:01.572 "write_zeroes": true, 00:36:01.572 "flush": true, 00:36:01.572 "reset": true, 00:36:01.572 "compare": false, 00:36:01.572 "compare_and_write": false, 00:36:01.572 "abort": true, 00:36:01.572 "nvme_admin": false, 00:36:01.572 "nvme_io": false 00:36:01.572 }, 00:36:01.572 "memory_domains": [ 00:36:01.572 { 00:36:01.572 "dma_device_id": "system", 00:36:01.572 "dma_device_type": 1 00:36:01.572 }, 00:36:01.572 { 00:36:01.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:01.572 "dma_device_type": 2 00:36:01.572 } 00:36:01.572 ], 00:36:01.572 "driver_specific": {} 00:36:01.572 }' 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:01.572 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:01.831 "name": "BaseBdev4", 00:36:01.831 "aliases": [ 00:36:01.831 "346414f4-f343-4488-88ee-9b30ee93ce44" 00:36:01.831 ], 00:36:01.831 "product_name": "Malloc disk", 00:36:01.831 "block_size": 512, 00:36:01.831 "num_blocks": 65536, 00:36:01.831 "uuid": "346414f4-f343-4488-88ee-9b30ee93ce44", 00:36:01.831 "assigned_rate_limits": { 00:36:01.831 "rw_ios_per_sec": 0, 00:36:01.831 "rw_mbytes_per_sec": 0, 00:36:01.831 "r_mbytes_per_sec": 0, 00:36:01.831 "w_mbytes_per_sec": 0 00:36:01.831 }, 00:36:01.831 "claimed": true, 00:36:01.831 "claim_type": "exclusive_write", 00:36:01.831 "zoned": false, 00:36:01.831 "supported_io_types": { 00:36:01.831 "read": true, 00:36:01.831 "write": true, 00:36:01.831 "unmap": true, 00:36:01.831 "write_zeroes": true, 00:36:01.831 "flush": 
true, 00:36:01.831 "reset": true, 00:36:01.831 "compare": false, 00:36:01.831 "compare_and_write": false, 00:36:01.831 "abort": true, 00:36:01.831 "nvme_admin": false, 00:36:01.831 "nvme_io": false 00:36:01.831 }, 00:36:01.831 "memory_domains": [ 00:36:01.831 { 00:36:01.831 "dma_device_id": "system", 00:36:01.831 "dma_device_type": 1 00:36:01.831 }, 00:36:01.831 { 00:36:01.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:01.831 "dma_device_type": 2 00:36:01.831 } 00:36:01.831 ], 00:36:01.831 "driver_specific": {} 00:36:01.831 }' 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:01.831 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:02.090 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:36:02.349 [2024-06-10 11:46:46.133279] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:02.349 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:02.608 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:02.608 "name": "Existed_Raid", 00:36:02.608 "uuid": "6811b467-e44b-4071-8efe-005ec12374da", 00:36:02.608 "strip_size_kb": 0, 00:36:02.608 "state": "online", 00:36:02.608 "raid_level": "raid1", 00:36:02.608 "superblock": true, 00:36:02.608 "num_base_bdevs": 4, 00:36:02.608 "num_base_bdevs_discovered": 3, 00:36:02.608 "num_base_bdevs_operational": 3, 00:36:02.608 "base_bdevs_list": [ 00:36:02.608 { 00:36:02.608 "name": null, 00:36:02.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:02.609 "is_configured": false, 00:36:02.609 "data_offset": 2048, 00:36:02.609 "data_size": 63488 00:36:02.609 }, 00:36:02.609 { 00:36:02.609 "name": "BaseBdev2", 00:36:02.609 "uuid": "7a90a15c-9f46-4cdc-a571-07a2f9f83ac0", 00:36:02.609 "is_configured": true, 00:36:02.609 "data_offset": 2048, 00:36:02.609 "data_size": 63488 00:36:02.609 }, 00:36:02.609 { 00:36:02.609 "name": "BaseBdev3", 00:36:02.609 "uuid": "c7c41121-34a0-4319-a32d-c4f827f46401", 00:36:02.609 "is_configured": true, 00:36:02.609 "data_offset": 2048, 00:36:02.609 "data_size": 63488 00:36:02.609 }, 00:36:02.609 { 00:36:02.609 "name": "BaseBdev4", 00:36:02.609 "uuid": "346414f4-f343-4488-88ee-9b30ee93ce44", 00:36:02.609 "is_configured": true, 00:36:02.609 "data_offset": 2048, 00:36:02.609 "data_size": 63488 00:36:02.609 } 00:36:02.609 ] 00:36:02.609 }' 00:36:02.609 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:02.609 11:46:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:03.177 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:36:03.177 11:46:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:36:03.177 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:03.177 11:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:36:03.177 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:36:03.177 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:36:03.177 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:36:03.435 [2024-06-10 11:46:47.161592] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:36:03.435 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:36:03.694 [2024-06-10 11:46:47.514438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:36:03.694 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:36:03.694 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:36:03.694 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:03.694 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:36:03.953 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:36:03.953 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:36:03.953 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:36:03.953 [2024-06-10 11:46:47.873354] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:36:03.953 [2024-06-10 11:46:47.873417] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:03.953 [2024-06-10 11:46:47.885366] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:03.953 [2024-06-10 11:46:47.885397] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:03.953 [2024-06-10 11:46:47.885406] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x216be20 name Existed_Raid, state offline 00:36:04.211 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:36:04.211 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:36:04.211 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:04.211 11:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:36:04.211 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:36:04.211 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:36:04.211 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:36:04.211 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:36:04.211 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:36:04.211 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:36:04.470 BaseBdev2 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:36:04.470 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:36:04.730 [ 00:36:04.730 { 00:36:04.730 "name": "BaseBdev2", 00:36:04.730 "aliases": [ 00:36:04.730 "cb8b92a5-9778-4d6e-859b-57d9ec8a495f" 00:36:04.730 ], 00:36:04.730 "product_name": "Malloc disk", 00:36:04.730 "block_size": 512, 00:36:04.730 "num_blocks": 65536, 00:36:04.730 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:04.730 "assigned_rate_limits": { 00:36:04.730 "rw_ios_per_sec": 0, 00:36:04.730 "rw_mbytes_per_sec": 0, 00:36:04.730 "r_mbytes_per_sec": 0, 00:36:04.730 "w_mbytes_per_sec": 0 00:36:04.730 }, 00:36:04.730 "claimed": false, 00:36:04.730 "zoned": false, 00:36:04.730 "supported_io_types": { 00:36:04.730 "read": true, 00:36:04.730 "write": true, 00:36:04.730 "unmap": true, 00:36:04.730 "write_zeroes": true, 00:36:04.730 "flush": true, 00:36:04.730 "reset": true, 00:36:04.730 "compare": false, 00:36:04.730 "compare_and_write": false, 00:36:04.730 "abort": true, 00:36:04.730 "nvme_admin": false, 00:36:04.730 "nvme_io": false 00:36:04.730 }, 00:36:04.730 "memory_domains": [ 00:36:04.730 { 00:36:04.730 "dma_device_id": "system", 00:36:04.730 "dma_device_type": 1 00:36:04.730 }, 00:36:04.730 { 00:36:04.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:04.730 "dma_device_type": 2 00:36:04.730 } 00:36:04.730 ], 00:36:04.730 "driver_specific": {} 00:36:04.730 } 00:36:04.730 ] 00:36:04.730 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:36:04.730 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:36:04.730 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:36:04.730 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:36:04.988 
BaseBdev3 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:36:04.988 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:36:05.247 11:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:36:05.247 [ 00:36:05.247 { 00:36:05.247 "name": "BaseBdev3", 00:36:05.247 "aliases": [ 00:36:05.247 "d4444dfb-f6fc-46c8-9421-826bc14e0d06" 00:36:05.247 ], 00:36:05.247 "product_name": "Malloc disk", 00:36:05.247 "block_size": 512, 00:36:05.247 "num_blocks": 65536, 00:36:05.247 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:05.247 "assigned_rate_limits": { 00:36:05.247 "rw_ios_per_sec": 0, 00:36:05.247 "rw_mbytes_per_sec": 0, 00:36:05.247 "r_mbytes_per_sec": 0, 00:36:05.247 "w_mbytes_per_sec": 0 00:36:05.247 }, 00:36:05.247 "claimed": false, 00:36:05.247 "zoned": false, 00:36:05.247 "supported_io_types": { 00:36:05.247 "read": true, 00:36:05.247 "write": true, 00:36:05.247 "unmap": true, 00:36:05.247 "write_zeroes": true, 00:36:05.247 "flush": true, 00:36:05.247 "reset": true, 00:36:05.247 "compare": false, 00:36:05.247 "compare_and_write": false, 
00:36:05.247 "abort": true, 00:36:05.247 "nvme_admin": false, 00:36:05.247 "nvme_io": false 00:36:05.247 }, 00:36:05.247 "memory_domains": [ 00:36:05.247 { 00:36:05.247 "dma_device_id": "system", 00:36:05.247 "dma_device_type": 1 00:36:05.247 }, 00:36:05.247 { 00:36:05.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:05.247 "dma_device_type": 2 00:36:05.247 } 00:36:05.247 ], 00:36:05.247 "driver_specific": {} 00:36:05.247 } 00:36:05.247 ] 00:36:05.247 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:36:05.247 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:36:05.247 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:36:05.247 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:36:05.506 BaseBdev4 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:36:05.506 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:36:05.765 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:36:05.765 [ 00:36:05.765 { 00:36:05.765 "name": "BaseBdev4", 00:36:05.765 "aliases": [ 00:36:05.765 "ebc51e5a-e43e-45e5-9092-9eb2452e910f" 00:36:05.765 ], 00:36:05.765 "product_name": "Malloc disk", 00:36:05.765 "block_size": 512, 00:36:05.765 "num_blocks": 65536, 00:36:05.765 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:05.765 "assigned_rate_limits": { 00:36:05.765 "rw_ios_per_sec": 0, 00:36:05.765 "rw_mbytes_per_sec": 0, 00:36:05.765 "r_mbytes_per_sec": 0, 00:36:05.765 "w_mbytes_per_sec": 0 00:36:05.765 }, 00:36:05.765 "claimed": false, 00:36:05.765 "zoned": false, 00:36:05.765 "supported_io_types": { 00:36:05.765 "read": true, 00:36:05.765 "write": true, 00:36:05.765 "unmap": true, 00:36:05.765 "write_zeroes": true, 00:36:05.765 "flush": true, 00:36:05.765 "reset": true, 00:36:05.765 "compare": false, 00:36:05.765 "compare_and_write": false, 00:36:05.765 "abort": true, 00:36:05.765 "nvme_admin": false, 00:36:05.765 "nvme_io": false 00:36:05.765 }, 00:36:05.765 "memory_domains": [ 00:36:05.765 { 00:36:05.765 "dma_device_id": "system", 00:36:05.765 "dma_device_type": 1 00:36:05.765 }, 00:36:05.765 { 00:36:05.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:05.765 "dma_device_type": 2 00:36:05.765 } 00:36:05.765 ], 00:36:05.765 "driver_specific": {} 00:36:05.765 } 00:36:05.765 ] 00:36:05.765 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:36:05.765 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:36:05.765 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:36:05.765 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 
BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:36:06.024 [2024-06-10 11:46:49.805742] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:36:06.024 [2024-06-10 11:46:49.805776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:36:06.024 [2024-06-10 11:46:49.805789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:06.024 [2024-06-10 11:46:49.806831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:06.024 [2024-06-10 11:46:49.806862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:06.024 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:06.283 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:06.283 "name": "Existed_Raid", 00:36:06.283 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:06.283 "strip_size_kb": 0, 00:36:06.283 "state": "configuring", 00:36:06.283 "raid_level": "raid1", 00:36:06.283 "superblock": true, 00:36:06.283 "num_base_bdevs": 4, 00:36:06.283 "num_base_bdevs_discovered": 3, 00:36:06.283 "num_base_bdevs_operational": 4, 00:36:06.283 "base_bdevs_list": [ 00:36:06.283 { 00:36:06.283 "name": "BaseBdev1", 00:36:06.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:06.283 "is_configured": false, 00:36:06.283 "data_offset": 0, 00:36:06.283 "data_size": 0 00:36:06.283 }, 00:36:06.283 { 00:36:06.283 "name": "BaseBdev2", 00:36:06.283 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:06.283 "is_configured": true, 00:36:06.283 "data_offset": 2048, 00:36:06.283 "data_size": 63488 00:36:06.283 }, 00:36:06.283 { 00:36:06.283 "name": "BaseBdev3", 00:36:06.283 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:06.283 "is_configured": true, 00:36:06.283 "data_offset": 2048, 00:36:06.283 "data_size": 63488 00:36:06.283 }, 00:36:06.283 { 00:36:06.283 "name": "BaseBdev4", 00:36:06.283 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:06.283 "is_configured": true, 00:36:06.283 "data_offset": 2048, 00:36:06.283 "data_size": 63488 00:36:06.283 } 00:36:06.283 ] 00:36:06.283 }' 00:36:06.283 11:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:06.283 11:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:06.541 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:36:06.800 [2024-06-10 11:46:50.619824] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:06.800 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:07.059 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:07.059 "name": "Existed_Raid", 00:36:07.059 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:07.059 "strip_size_kb": 0, 
00:36:07.059 "state": "configuring", 00:36:07.059 "raid_level": "raid1", 00:36:07.059 "superblock": true, 00:36:07.059 "num_base_bdevs": 4, 00:36:07.059 "num_base_bdevs_discovered": 2, 00:36:07.059 "num_base_bdevs_operational": 4, 00:36:07.059 "base_bdevs_list": [ 00:36:07.059 { 00:36:07.059 "name": "BaseBdev1", 00:36:07.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:07.059 "is_configured": false, 00:36:07.059 "data_offset": 0, 00:36:07.059 "data_size": 0 00:36:07.059 }, 00:36:07.059 { 00:36:07.059 "name": null, 00:36:07.059 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:07.059 "is_configured": false, 00:36:07.059 "data_offset": 2048, 00:36:07.059 "data_size": 63488 00:36:07.059 }, 00:36:07.059 { 00:36:07.059 "name": "BaseBdev3", 00:36:07.059 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:07.059 "is_configured": true, 00:36:07.059 "data_offset": 2048, 00:36:07.059 "data_size": 63488 00:36:07.059 }, 00:36:07.059 { 00:36:07.059 "name": "BaseBdev4", 00:36:07.059 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:07.059 "is_configured": true, 00:36:07.059 "data_offset": 2048, 00:36:07.059 "data_size": 63488 00:36:07.059 } 00:36:07.059 ] 00:36:07.059 }' 00:36:07.059 11:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:07.059 11:46:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:07.626 11:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:07.626 11:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:36:07.626 11:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:36:07.626 11:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:36:07.884 [2024-06-10 11:46:51.654437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:07.884 BaseBdev1 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:36:07.884 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:36:08.142 11:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:36:08.142 [ 00:36:08.142 { 00:36:08.142 "name": "BaseBdev1", 00:36:08.142 "aliases": [ 00:36:08.142 "871629b2-bed0-406e-ad89-0a9cb94f333e" 00:36:08.142 ], 00:36:08.142 "product_name": "Malloc disk", 00:36:08.142 "block_size": 512, 00:36:08.142 "num_blocks": 65536, 00:36:08.142 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:08.142 "assigned_rate_limits": { 00:36:08.142 "rw_ios_per_sec": 0, 00:36:08.142 "rw_mbytes_per_sec": 0, 00:36:08.142 "r_mbytes_per_sec": 0, 00:36:08.142 "w_mbytes_per_sec": 0 00:36:08.142 }, 00:36:08.142 "claimed": true, 00:36:08.142 "claim_type": "exclusive_write", 
00:36:08.142 "zoned": false, 00:36:08.142 "supported_io_types": { 00:36:08.142 "read": true, 00:36:08.142 "write": true, 00:36:08.142 "unmap": true, 00:36:08.142 "write_zeroes": true, 00:36:08.142 "flush": true, 00:36:08.142 "reset": true, 00:36:08.142 "compare": false, 00:36:08.142 "compare_and_write": false, 00:36:08.142 "abort": true, 00:36:08.142 "nvme_admin": false, 00:36:08.142 "nvme_io": false 00:36:08.142 }, 00:36:08.142 "memory_domains": [ 00:36:08.142 { 00:36:08.142 "dma_device_id": "system", 00:36:08.142 "dma_device_type": 1 00:36:08.142 }, 00:36:08.142 { 00:36:08.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:08.142 "dma_device_type": 2 00:36:08.142 } 00:36:08.142 ], 00:36:08.142 "driver_specific": {} 00:36:08.142 } 00:36:08.142 ] 00:36:08.142 11:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:36:08.142 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:08.142 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:08.142 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:08.143 11:46:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:08.143 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:08.400 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:08.400 "name": "Existed_Raid", 00:36:08.400 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:08.400 "strip_size_kb": 0, 00:36:08.400 "state": "configuring", 00:36:08.400 "raid_level": "raid1", 00:36:08.400 "superblock": true, 00:36:08.400 "num_base_bdevs": 4, 00:36:08.400 "num_base_bdevs_discovered": 3, 00:36:08.400 "num_base_bdevs_operational": 4, 00:36:08.400 "base_bdevs_list": [ 00:36:08.400 { 00:36:08.400 "name": "BaseBdev1", 00:36:08.400 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:08.400 "is_configured": true, 00:36:08.400 "data_offset": 2048, 00:36:08.400 "data_size": 63488 00:36:08.400 }, 00:36:08.400 { 00:36:08.400 "name": null, 00:36:08.400 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:08.400 "is_configured": false, 00:36:08.400 "data_offset": 2048, 00:36:08.400 "data_size": 63488 00:36:08.400 }, 00:36:08.400 { 00:36:08.400 "name": "BaseBdev3", 00:36:08.400 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:08.400 "is_configured": true, 00:36:08.400 "data_offset": 2048, 00:36:08.400 "data_size": 63488 00:36:08.400 }, 00:36:08.400 { 00:36:08.400 "name": "BaseBdev4", 00:36:08.400 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:08.400 "is_configured": true, 00:36:08.400 "data_offset": 2048, 00:36:08.400 "data_size": 63488 00:36:08.400 } 00:36:08.400 ] 00:36:08.400 }' 00:36:08.400 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:08.400 11:46:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:08.966 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:08.966 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:36:08.966 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:36:08.966 11:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:36:09.225 [2024-06-10 11:46:53.038036] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:09.225 11:46:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:09.225 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:09.483 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:09.483 "name": "Existed_Raid", 00:36:09.483 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:09.483 "strip_size_kb": 0, 00:36:09.483 "state": "configuring", 00:36:09.483 "raid_level": "raid1", 00:36:09.483 "superblock": true, 00:36:09.483 "num_base_bdevs": 4, 00:36:09.483 "num_base_bdevs_discovered": 2, 00:36:09.483 "num_base_bdevs_operational": 4, 00:36:09.483 "base_bdevs_list": [ 00:36:09.483 { 00:36:09.483 "name": "BaseBdev1", 00:36:09.483 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:09.483 "is_configured": true, 00:36:09.483 "data_offset": 2048, 00:36:09.483 "data_size": 63488 00:36:09.483 }, 00:36:09.483 { 00:36:09.483 "name": null, 00:36:09.483 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:09.483 "is_configured": false, 00:36:09.483 "data_offset": 2048, 00:36:09.483 "data_size": 63488 00:36:09.483 }, 00:36:09.483 { 00:36:09.483 "name": null, 00:36:09.483 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:09.483 "is_configured": false, 00:36:09.483 "data_offset": 2048, 00:36:09.483 "data_size": 63488 00:36:09.483 }, 00:36:09.483 { 00:36:09.483 "name": "BaseBdev4", 00:36:09.483 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:09.483 "is_configured": true, 00:36:09.483 "data_offset": 2048, 00:36:09.483 "data_size": 63488 00:36:09.483 } 00:36:09.483 ] 00:36:09.483 }' 00:36:09.483 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:09.483 11:46:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:10.051 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:10.051 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:36:10.051 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:36:10.051 11:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:36:10.310 [2024-06-10 11:46:54.072734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:10.310 11:46:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:10.310 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:10.570 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:10.570 "name": "Existed_Raid", 00:36:10.570 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:10.570 "strip_size_kb": 0, 00:36:10.570 "state": "configuring", 00:36:10.570 "raid_level": "raid1", 00:36:10.570 "superblock": true, 00:36:10.570 "num_base_bdevs": 4, 00:36:10.570 "num_base_bdevs_discovered": 3, 00:36:10.570 "num_base_bdevs_operational": 4, 00:36:10.570 "base_bdevs_list": [ 00:36:10.570 { 00:36:10.570 "name": "BaseBdev1", 00:36:10.570 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:10.570 "is_configured": true, 00:36:10.570 "data_offset": 2048, 00:36:10.570 "data_size": 63488 00:36:10.570 }, 00:36:10.570 { 00:36:10.570 "name": null, 00:36:10.570 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:10.570 "is_configured": false, 00:36:10.570 "data_offset": 2048, 00:36:10.570 "data_size": 63488 00:36:10.570 }, 00:36:10.570 { 00:36:10.570 "name": "BaseBdev3", 00:36:10.570 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:10.570 "is_configured": true, 00:36:10.570 "data_offset": 2048, 00:36:10.570 "data_size": 63488 00:36:10.570 }, 00:36:10.570 { 00:36:10.570 "name": "BaseBdev4", 00:36:10.570 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:10.570 "is_configured": true, 00:36:10.570 "data_offset": 2048, 00:36:10.570 "data_size": 63488 00:36:10.570 } 00:36:10.570 ] 00:36:10.570 }' 00:36:10.570 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:10.570 11:46:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:10.829 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:10.829 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:36:11.107 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:36:11.107 11:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:36:11.444 [2024-06-10 11:46:55.091374] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:11.444 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:11.445 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:11.445 "name": "Existed_Raid", 00:36:11.445 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:11.445 "strip_size_kb": 0, 00:36:11.445 "state": "configuring", 00:36:11.445 "raid_level": "raid1", 00:36:11.445 "superblock": true, 00:36:11.445 "num_base_bdevs": 4, 00:36:11.445 "num_base_bdevs_discovered": 2, 00:36:11.445 "num_base_bdevs_operational": 4, 00:36:11.445 "base_bdevs_list": [ 00:36:11.445 { 00:36:11.445 "name": null, 00:36:11.445 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:11.445 "is_configured": false, 00:36:11.445 "data_offset": 2048, 00:36:11.445 "data_size": 63488 00:36:11.445 }, 00:36:11.445 { 00:36:11.445 "name": null, 00:36:11.445 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:11.445 "is_configured": false, 00:36:11.445 "data_offset": 2048, 00:36:11.445 "data_size": 63488 00:36:11.445 }, 00:36:11.445 { 00:36:11.445 "name": "BaseBdev3", 00:36:11.445 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:11.445 "is_configured": true, 00:36:11.445 "data_offset": 2048, 00:36:11.445 "data_size": 63488 00:36:11.445 }, 00:36:11.445 { 00:36:11.445 "name": "BaseBdev4", 00:36:11.445 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:11.445 "is_configured": true, 00:36:11.445 "data_offset": 2048, 00:36:11.445 "data_size": 63488 00:36:11.445 } 00:36:11.445 ] 00:36:11.445 }' 00:36:11.445 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:11.445 11:46:55 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:36:12.013 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:36:12.013 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:12.273 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:36:12.273 11:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:36:12.273 [2024-06-10 11:46:56.137341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:12.273 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:12.533 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:12.533 "name": "Existed_Raid", 00:36:12.533 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:12.533 "strip_size_kb": 0, 00:36:12.533 "state": "configuring", 00:36:12.533 "raid_level": "raid1", 00:36:12.533 "superblock": true, 00:36:12.533 "num_base_bdevs": 4, 00:36:12.533 "num_base_bdevs_discovered": 3, 00:36:12.533 "num_base_bdevs_operational": 4, 00:36:12.533 "base_bdevs_list": [ 00:36:12.533 { 00:36:12.533 "name": null, 00:36:12.533 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:12.533 "is_configured": false, 00:36:12.533 "data_offset": 2048, 00:36:12.533 "data_size": 63488 00:36:12.533 }, 00:36:12.533 { 00:36:12.533 "name": "BaseBdev2", 00:36:12.533 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:12.533 "is_configured": true, 00:36:12.533 "data_offset": 2048, 00:36:12.533 "data_size": 63488 00:36:12.533 }, 00:36:12.533 { 00:36:12.533 "name": "BaseBdev3", 00:36:12.533 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:12.533 "is_configured": true, 00:36:12.533 "data_offset": 2048, 00:36:12.533 "data_size": 63488 00:36:12.533 }, 00:36:12.533 { 00:36:12.533 "name": "BaseBdev4", 00:36:12.533 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:12.533 "is_configured": true, 00:36:12.533 "data_offset": 2048, 00:36:12.533 "data_size": 63488 00:36:12.533 } 00:36:12.533 ] 00:36:12.533 }' 00:36:12.533 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:12.533 11:46:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:36:13.101 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:13.101 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:36:13.101 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:36:13.101 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:13.101 11:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:36:13.360 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 871629b2-bed0-406e-ad89-0a9cb94f333e 00:36:13.620 [2024-06-10 11:46:57.327208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:36:13.620 [2024-06-10 11:46:57.327330] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2170c50 00:36:13.620 [2024-06-10 11:46:57.327338] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:13.620 [2024-06-10 11:46:57.327457] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x216bd90 00:36:13.620 [2024-06-10 11:46:57.327542] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2170c50 00:36:13.620 [2024-06-10 11:46:57.327548] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2170c50 00:36:13.620 [2024-06-10 11:46:57.327610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:13.620 NewBaseBdev 
00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:36:13.620 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:36:13.879 [ 00:36:13.879 { 00:36:13.879 "name": "NewBaseBdev", 00:36:13.879 "aliases": [ 00:36:13.879 "871629b2-bed0-406e-ad89-0a9cb94f333e" 00:36:13.879 ], 00:36:13.879 "product_name": "Malloc disk", 00:36:13.879 "block_size": 512, 00:36:13.879 "num_blocks": 65536, 00:36:13.879 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:13.879 "assigned_rate_limits": { 00:36:13.879 "rw_ios_per_sec": 0, 00:36:13.879 "rw_mbytes_per_sec": 0, 00:36:13.879 "r_mbytes_per_sec": 0, 00:36:13.879 "w_mbytes_per_sec": 0 00:36:13.879 }, 00:36:13.879 "claimed": true, 00:36:13.879 "claim_type": "exclusive_write", 00:36:13.879 "zoned": false, 00:36:13.879 "supported_io_types": { 00:36:13.879 "read": true, 00:36:13.879 "write": true, 00:36:13.879 "unmap": true, 00:36:13.879 "write_zeroes": true, 00:36:13.879 "flush": true, 00:36:13.879 "reset": true, 00:36:13.879 "compare": 
false, 00:36:13.879 "compare_and_write": false, 00:36:13.879 "abort": true, 00:36:13.879 "nvme_admin": false, 00:36:13.879 "nvme_io": false 00:36:13.879 }, 00:36:13.879 "memory_domains": [ 00:36:13.879 { 00:36:13.879 "dma_device_id": "system", 00:36:13.879 "dma_device_type": 1 00:36:13.879 }, 00:36:13.879 { 00:36:13.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:13.879 "dma_device_type": 2 00:36:13.879 } 00:36:13.879 ], 00:36:13.879 "driver_specific": {} 00:36:13.879 } 00:36:13.879 ] 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:13.879 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:13.879 
11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:36:14.138 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:14.138 "name": "Existed_Raid", 00:36:14.138 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:14.138 "strip_size_kb": 0, 00:36:14.138 "state": "online", 00:36:14.138 "raid_level": "raid1", 00:36:14.138 "superblock": true, 00:36:14.138 "num_base_bdevs": 4, 00:36:14.138 "num_base_bdevs_discovered": 4, 00:36:14.138 "num_base_bdevs_operational": 4, 00:36:14.138 "base_bdevs_list": [ 00:36:14.138 { 00:36:14.138 "name": "NewBaseBdev", 00:36:14.138 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:14.138 "is_configured": true, 00:36:14.138 "data_offset": 2048, 00:36:14.138 "data_size": 63488 00:36:14.138 }, 00:36:14.138 { 00:36:14.138 "name": "BaseBdev2", 00:36:14.138 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:14.138 "is_configured": true, 00:36:14.138 "data_offset": 2048, 00:36:14.138 "data_size": 63488 00:36:14.138 }, 00:36:14.138 { 00:36:14.138 "name": "BaseBdev3", 00:36:14.138 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:14.138 "is_configured": true, 00:36:14.138 "data_offset": 2048, 00:36:14.138 "data_size": 63488 00:36:14.138 }, 00:36:14.138 { 00:36:14.138 "name": "BaseBdev4", 00:36:14.138 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:14.138 "is_configured": true, 00:36:14.138 "data_offset": 2048, 00:36:14.138 "data_size": 63488 00:36:14.138 } 00:36:14.138 ] 00:36:14.138 }' 00:36:14.138 11:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:14.138 11:46:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # 
local raid_bdev_name=Existed_Raid 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:36:14.707 [2024-06-10 11:46:58.518467] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:36:14.707 "name": "Existed_Raid", 00:36:14.707 "aliases": [ 00:36:14.707 "e23b9c8f-55d8-414f-b18c-958f2616471d" 00:36:14.707 ], 00:36:14.707 "product_name": "Raid Volume", 00:36:14.707 "block_size": 512, 00:36:14.707 "num_blocks": 63488, 00:36:14.707 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:14.707 "assigned_rate_limits": { 00:36:14.707 "rw_ios_per_sec": 0, 00:36:14.707 "rw_mbytes_per_sec": 0, 00:36:14.707 "r_mbytes_per_sec": 0, 00:36:14.707 "w_mbytes_per_sec": 0 00:36:14.707 }, 00:36:14.707 "claimed": false, 00:36:14.707 "zoned": false, 00:36:14.707 "supported_io_types": { 00:36:14.707 "read": true, 00:36:14.707 "write": true, 00:36:14.707 "unmap": false, 00:36:14.707 "write_zeroes": true, 00:36:14.707 "flush": false, 00:36:14.707 "reset": true, 00:36:14.707 "compare": false, 00:36:14.707 "compare_and_write": false, 00:36:14.707 "abort": false, 00:36:14.707 "nvme_admin": false, 00:36:14.707 "nvme_io": false 00:36:14.707 }, 00:36:14.707 "memory_domains": [ 
00:36:14.707 { 00:36:14.707 "dma_device_id": "system", 00:36:14.707 "dma_device_type": 1 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:14.707 "dma_device_type": 2 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "system", 00:36:14.707 "dma_device_type": 1 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:14.707 "dma_device_type": 2 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "system", 00:36:14.707 "dma_device_type": 1 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:14.707 "dma_device_type": 2 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "system", 00:36:14.707 "dma_device_type": 1 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:14.707 "dma_device_type": 2 00:36:14.707 } 00:36:14.707 ], 00:36:14.707 "driver_specific": { 00:36:14.707 "raid": { 00:36:14.707 "uuid": "e23b9c8f-55d8-414f-b18c-958f2616471d", 00:36:14.707 "strip_size_kb": 0, 00:36:14.707 "state": "online", 00:36:14.707 "raid_level": "raid1", 00:36:14.707 "superblock": true, 00:36:14.707 "num_base_bdevs": 4, 00:36:14.707 "num_base_bdevs_discovered": 4, 00:36:14.707 "num_base_bdevs_operational": 4, 00:36:14.707 "base_bdevs_list": [ 00:36:14.707 { 00:36:14.707 "name": "NewBaseBdev", 00:36:14.707 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:14.707 "is_configured": true, 00:36:14.707 "data_offset": 2048, 00:36:14.707 "data_size": 63488 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "name": "BaseBdev2", 00:36:14.707 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:14.707 "is_configured": true, 00:36:14.707 "data_offset": 2048, 00:36:14.707 "data_size": 63488 00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "name": "BaseBdev3", 00:36:14.707 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:14.707 "is_configured": true, 00:36:14.707 "data_offset": 2048, 00:36:14.707 "data_size": 63488 
00:36:14.707 }, 00:36:14.707 { 00:36:14.707 "name": "BaseBdev4", 00:36:14.707 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:14.707 "is_configured": true, 00:36:14.707 "data_offset": 2048, 00:36:14.707 "data_size": 63488 00:36:14.707 } 00:36:14.707 ] 00:36:14.707 } 00:36:14.707 } 00:36:14.707 }' 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:36:14.707 BaseBdev2 00:36:14.707 BaseBdev3 00:36:14.707 BaseBdev4' 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:36:14.707 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:14.967 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:14.967 "name": "NewBaseBdev", 00:36:14.967 "aliases": [ 00:36:14.967 "871629b2-bed0-406e-ad89-0a9cb94f333e" 00:36:14.967 ], 00:36:14.967 "product_name": "Malloc disk", 00:36:14.967 "block_size": 512, 00:36:14.967 "num_blocks": 65536, 00:36:14.967 "uuid": "871629b2-bed0-406e-ad89-0a9cb94f333e", 00:36:14.967 "assigned_rate_limits": { 00:36:14.967 "rw_ios_per_sec": 0, 00:36:14.967 "rw_mbytes_per_sec": 0, 00:36:14.967 "r_mbytes_per_sec": 0, 00:36:14.967 "w_mbytes_per_sec": 0 00:36:14.967 }, 00:36:14.967 "claimed": true, 00:36:14.967 "claim_type": "exclusive_write", 00:36:14.967 "zoned": false, 00:36:14.967 "supported_io_types": { 00:36:14.967 "read": true, 00:36:14.967 "write": true, 00:36:14.967 "unmap": true, 00:36:14.967 "write_zeroes": true, 00:36:14.967 "flush": true, 
00:36:14.967 "reset": true, 00:36:14.967 "compare": false, 00:36:14.967 "compare_and_write": false, 00:36:14.967 "abort": true, 00:36:14.967 "nvme_admin": false, 00:36:14.967 "nvme_io": false 00:36:14.967 }, 00:36:14.967 "memory_domains": [ 00:36:14.967 { 00:36:14.967 "dma_device_id": "system", 00:36:14.967 "dma_device_type": 1 00:36:14.967 }, 00:36:14.967 { 00:36:14.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:14.967 "dma_device_type": 2 00:36:14.967 } 00:36:14.967 ], 00:36:14.967 "driver_specific": {} 00:36:14.967 }' 00:36:14.967 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:14.967 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:14.967 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:14.967 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:14.967 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:15.226 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:15.226 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:15.226 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:15.226 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:15.226 11:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:15.226 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:15.226 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:15.226 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:15.226 11:46:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:36:15.226 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:15.486 "name": "BaseBdev2", 00:36:15.486 "aliases": [ 00:36:15.486 "cb8b92a5-9778-4d6e-859b-57d9ec8a495f" 00:36:15.486 ], 00:36:15.486 "product_name": "Malloc disk", 00:36:15.486 "block_size": 512, 00:36:15.486 "num_blocks": 65536, 00:36:15.486 "uuid": "cb8b92a5-9778-4d6e-859b-57d9ec8a495f", 00:36:15.486 "assigned_rate_limits": { 00:36:15.486 "rw_ios_per_sec": 0, 00:36:15.486 "rw_mbytes_per_sec": 0, 00:36:15.486 "r_mbytes_per_sec": 0, 00:36:15.486 "w_mbytes_per_sec": 0 00:36:15.486 }, 00:36:15.486 "claimed": true, 00:36:15.486 "claim_type": "exclusive_write", 00:36:15.486 "zoned": false, 00:36:15.486 "supported_io_types": { 00:36:15.486 "read": true, 00:36:15.486 "write": true, 00:36:15.486 "unmap": true, 00:36:15.486 "write_zeroes": true, 00:36:15.486 "flush": true, 00:36:15.486 "reset": true, 00:36:15.486 "compare": false, 00:36:15.486 "compare_and_write": false, 00:36:15.486 "abort": true, 00:36:15.486 "nvme_admin": false, 00:36:15.486 "nvme_io": false 00:36:15.486 }, 00:36:15.486 "memory_domains": [ 00:36:15.486 { 00:36:15.486 "dma_device_id": "system", 00:36:15.486 "dma_device_type": 1 00:36:15.486 }, 00:36:15.486 { 00:36:15.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:15.486 "dma_device_type": 2 00:36:15.486 } 00:36:15.486 ], 00:36:15.486 "driver_specific": {} 00:36:15.486 }' 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:15.486 
11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:15.486 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:15.745 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:36:16.004 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:16.004 "name": "BaseBdev3", 00:36:16.004 "aliases": [ 00:36:16.004 "d4444dfb-f6fc-46c8-9421-826bc14e0d06" 00:36:16.004 ], 00:36:16.004 "product_name": "Malloc disk", 00:36:16.004 "block_size": 512, 00:36:16.004 "num_blocks": 65536, 00:36:16.004 "uuid": "d4444dfb-f6fc-46c8-9421-826bc14e0d06", 00:36:16.004 "assigned_rate_limits": { 00:36:16.004 "rw_ios_per_sec": 0, 00:36:16.004 "rw_mbytes_per_sec": 0, 00:36:16.004 "r_mbytes_per_sec": 0, 00:36:16.004 "w_mbytes_per_sec": 0 00:36:16.004 }, 00:36:16.004 "claimed": true, 
00:36:16.004 "claim_type": "exclusive_write", 00:36:16.004 "zoned": false, 00:36:16.004 "supported_io_types": { 00:36:16.004 "read": true, 00:36:16.004 "write": true, 00:36:16.004 "unmap": true, 00:36:16.004 "write_zeroes": true, 00:36:16.005 "flush": true, 00:36:16.005 "reset": true, 00:36:16.005 "compare": false, 00:36:16.005 "compare_and_write": false, 00:36:16.005 "abort": true, 00:36:16.005 "nvme_admin": false, 00:36:16.005 "nvme_io": false 00:36:16.005 }, 00:36:16.005 "memory_domains": [ 00:36:16.005 { 00:36:16.005 "dma_device_id": "system", 00:36:16.005 "dma_device_type": 1 00:36:16.005 }, 00:36:16.005 { 00:36:16.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:16.005 "dma_device_type": 2 00:36:16.005 } 00:36:16.005 ], 00:36:16.005 "driver_specific": {} 00:36:16.005 }' 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:16.005 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:16.264 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:16.264 11:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:16.264 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:16.264 11:47:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:16.264 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:16.264 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:36:16.264 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:16.523 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:16.523 "name": "BaseBdev4", 00:36:16.523 "aliases": [ 00:36:16.523 "ebc51e5a-e43e-45e5-9092-9eb2452e910f" 00:36:16.523 ], 00:36:16.523 "product_name": "Malloc disk", 00:36:16.523 "block_size": 512, 00:36:16.523 "num_blocks": 65536, 00:36:16.523 "uuid": "ebc51e5a-e43e-45e5-9092-9eb2452e910f", 00:36:16.523 "assigned_rate_limits": { 00:36:16.523 "rw_ios_per_sec": 0, 00:36:16.523 "rw_mbytes_per_sec": 0, 00:36:16.523 "r_mbytes_per_sec": 0, 00:36:16.523 "w_mbytes_per_sec": 0 00:36:16.523 }, 00:36:16.523 "claimed": true, 00:36:16.523 "claim_type": "exclusive_write", 00:36:16.523 "zoned": false, 00:36:16.523 "supported_io_types": { 00:36:16.523 "read": true, 00:36:16.523 "write": true, 00:36:16.523 "unmap": true, 00:36:16.523 "write_zeroes": true, 00:36:16.523 "flush": true, 00:36:16.523 "reset": true, 00:36:16.523 "compare": false, 00:36:16.523 "compare_and_write": false, 00:36:16.523 "abort": true, 00:36:16.523 "nvme_admin": false, 00:36:16.523 "nvme_io": false 00:36:16.523 }, 00:36:16.523 "memory_domains": [ 00:36:16.523 { 00:36:16.523 "dma_device_id": "system", 00:36:16.523 "dma_device_type": 1 00:36:16.523 }, 00:36:16.523 { 00:36:16.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:16.523 "dma_device_type": 2 00:36:16.523 } 00:36:16.523 ], 00:36:16.523 "driver_specific": {} 00:36:16.523 }' 00:36:16.523 11:47:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:16.523 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:16.523 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:16.524 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:16.524 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:16.524 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:16.524 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:16.524 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:16.783 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:16.783 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:16.783 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:16.783 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:16.783 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:36:16.783 [2024-06-10 11:47:00.720007] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:36:16.783 [2024-06-10 11:47:00.720033] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:16.783 [2024-06-10 11:47:00.720070] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:16.783 [2024-06-10 11:47:00.720260] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:16.783 [2024-06-10 11:47:00.720269] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2170c50 name Existed_Raid, state offline 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 207537 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 207537 ']' 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 207537 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 207537 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 207537' 00:36:17.043 killing process with pid 207537 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 207537 00:36:17.043 [2024-06-10 11:47:00.783115] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:36:17.043 11:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 207537 00:36:17.043 [2024-06-10 11:47:00.818103] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:36:17.302 11:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:36:17.302 00:36:17.302 real 0m24.805s 00:36:17.302 user 0m45.252s 00:36:17.302 sys 0m4.840s 00:36:17.302 11:47:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:36:17.302 11:47:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:17.302 ************************************ 00:36:17.303 END TEST raid_state_function_test_sb 00:36:17.303 ************************************ 00:36:17.303 11:47:01 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:36:17.303 11:47:01 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:36:17.303 11:47:01 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:36:17.303 11:47:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:36:17.303 ************************************ 00:36:17.303 START TEST raid_superblock_test 00:36:17.303 ************************************ 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 4 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=211433 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 211433 /var/tmp/spdk-raid.sock 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 211433 ']' 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:36:17.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:36:17.303 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:17.303 [2024-06-10 11:47:01.140234] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:36:17.303 [2024-06-10 11:47:01.140291] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid211433 ] 00:36:17.303 [2024-06-10 11:47:01.227030] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:17.562 [2024-06-10 11:47:01.315682] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:36:17.562 [2024-06-10 11:47:01.374809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:17.562 [2024-06-10 11:47:01.374840] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:36:18.131 11:47:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:36:18.389 malloc1 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:36:18.390 [2024-06-10 11:47:02.300220] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:36:18.390 [2024-06-10 11:47:02.300260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:18.390 [2024-06-10 11:47:02.300274] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x982100 00:36:18.390 [2024-06-10 11:47:02.300282] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:18.390 [2024-06-10 11:47:02.301470] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:18.390 [2024-06-10 11:47:02.301498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:36:18.390 pt1 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:36:18.390 11:47:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:36:18.648 malloc2 00:36:18.648 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:36:18.907 [2024-06-10 11:47:02.673075] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:36:18.907 [2024-06-10 11:47:02.673107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:18.907 [2024-06-10 11:47:02.673119] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x983500 00:36:18.907 [2024-06-10 11:47:02.673126] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:18.907 [2024-06-10 11:47:02.674139] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:18.907 [2024-06-10 11:47:02.674160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:36:18.907 pt2 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:36:18.907 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:36:19.166 malloc3 00:36:19.166 11:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:36:19.166 [2024-06-10 11:47:03.026793] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:36:19.166 [2024-06-10 11:47:03.026829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:19.166 [2024-06-10 11:47:03.026842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2d7a0 00:36:19.166 [2024-06-10 11:47:03.026850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:19.166 [2024-06-10 11:47:03.027947] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:19.166 [2024-06-10 11:47:03.027971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:36:19.166 pt3 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:36:19.166 11:47:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:36:19.166 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:36:19.425 malloc4 00:36:19.425 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:36:19.425 [2024-06-10 11:47:03.368619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:36:19.425 [2024-06-10 11:47:03.368658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:19.425 [2024-06-10 11:47:03.368671] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2fb50 00:36:19.425 [2024-06-10 11:47:03.368679] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:19.425 [2024-06-10 11:47:03.369844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:19.425 [2024-06-10 11:47:03.369876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:36:19.685 pt4 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:36:19.685 [2024-06-10 11:47:03.545097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:36:19.685 
[2024-06-10 11:47:03.546137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:36:19.685 [2024-06-10 11:47:03.546178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:36:19.685 [2024-06-10 11:47:03.546210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:36:19.685 [2024-06-10 11:47:03.546336] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb301a0 00:36:19.685 [2024-06-10 11:47:03.546344] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:19.685 [2024-06-10 11:47:03.546493] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x981090 00:36:19.685 [2024-06-10 11:47:03.546601] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb301a0 00:36:19.685 [2024-06-10 11:47:03.546608] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb301a0 00:36:19.685 [2024-06-10 11:47:03.546678] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:19.685 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:19.944 11:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:19.944 "name": "raid_bdev1", 00:36:19.944 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:19.944 "strip_size_kb": 0, 00:36:19.944 "state": "online", 00:36:19.944 "raid_level": "raid1", 00:36:19.944 "superblock": true, 00:36:19.944 "num_base_bdevs": 4, 00:36:19.944 "num_base_bdevs_discovered": 4, 00:36:19.944 "num_base_bdevs_operational": 4, 00:36:19.944 "base_bdevs_list": [ 00:36:19.944 { 00:36:19.944 "name": "pt1", 00:36:19.944 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:19.944 "is_configured": true, 00:36:19.944 "data_offset": 2048, 00:36:19.944 "data_size": 63488 00:36:19.944 }, 00:36:19.944 { 00:36:19.944 "name": "pt2", 00:36:19.944 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:19.944 "is_configured": true, 00:36:19.944 "data_offset": 2048, 00:36:19.944 "data_size": 63488 00:36:19.944 }, 00:36:19.944 { 00:36:19.944 "name": "pt3", 00:36:19.944 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:19.944 "is_configured": true, 00:36:19.944 "data_offset": 2048, 00:36:19.944 "data_size": 63488 00:36:19.944 }, 00:36:19.945 { 00:36:19.945 "name": "pt4", 00:36:19.945 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:19.945 "is_configured": true, 00:36:19.945 "data_offset": 2048, 00:36:19.945 "data_size": 63488 00:36:19.945 } 00:36:19.945 ] 00:36:19.945 }' 00:36:19.945 11:47:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:19.945 11:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:36:20.513 [2024-06-10 11:47:04.355354] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:20.513 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:36:20.513 "name": "raid_bdev1", 00:36:20.513 "aliases": [ 00:36:20.513 "1c16718b-d2b5-4851-af63-1f54b4c2ebe2" 00:36:20.513 ], 00:36:20.513 "product_name": "Raid Volume", 00:36:20.513 "block_size": 512, 00:36:20.513 "num_blocks": 63488, 00:36:20.513 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:20.513 "assigned_rate_limits": { 00:36:20.513 "rw_ios_per_sec": 0, 00:36:20.513 "rw_mbytes_per_sec": 0, 00:36:20.513 "r_mbytes_per_sec": 0, 00:36:20.513 "w_mbytes_per_sec": 0 00:36:20.513 }, 00:36:20.513 "claimed": false, 00:36:20.513 "zoned": false, 00:36:20.513 "supported_io_types": { 00:36:20.513 "read": true, 00:36:20.513 "write": true, 00:36:20.513 "unmap": false, 00:36:20.513 
"write_zeroes": true, 00:36:20.513 "flush": false, 00:36:20.513 "reset": true, 00:36:20.513 "compare": false, 00:36:20.513 "compare_and_write": false, 00:36:20.513 "abort": false, 00:36:20.513 "nvme_admin": false, 00:36:20.513 "nvme_io": false 00:36:20.513 }, 00:36:20.513 "memory_domains": [ 00:36:20.513 { 00:36:20.513 "dma_device_id": "system", 00:36:20.513 "dma_device_type": 1 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:20.513 "dma_device_type": 2 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "system", 00:36:20.513 "dma_device_type": 1 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:20.513 "dma_device_type": 2 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "system", 00:36:20.513 "dma_device_type": 1 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:20.513 "dma_device_type": 2 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "system", 00:36:20.513 "dma_device_type": 1 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:20.513 "dma_device_type": 2 00:36:20.513 } 00:36:20.513 ], 00:36:20.513 "driver_specific": { 00:36:20.513 "raid": { 00:36:20.513 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:20.513 "strip_size_kb": 0, 00:36:20.513 "state": "online", 00:36:20.513 "raid_level": "raid1", 00:36:20.513 "superblock": true, 00:36:20.513 "num_base_bdevs": 4, 00:36:20.513 "num_base_bdevs_discovered": 4, 00:36:20.513 "num_base_bdevs_operational": 4, 00:36:20.513 "base_bdevs_list": [ 00:36:20.513 { 00:36:20.513 "name": "pt1", 00:36:20.513 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:20.513 "is_configured": true, 00:36:20.513 "data_offset": 2048, 00:36:20.513 "data_size": 63488 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "name": "pt2", 00:36:20.513 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:20.513 "is_configured": true, 00:36:20.513 
"data_offset": 2048, 00:36:20.513 "data_size": 63488 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "name": "pt3", 00:36:20.513 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:20.513 "is_configured": true, 00:36:20.513 "data_offset": 2048, 00:36:20.513 "data_size": 63488 00:36:20.513 }, 00:36:20.513 { 00:36:20.513 "name": "pt4", 00:36:20.513 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:20.513 "is_configured": true, 00:36:20.513 "data_offset": 2048, 00:36:20.513 "data_size": 63488 00:36:20.513 } 00:36:20.514 ] 00:36:20.514 } 00:36:20.514 } 00:36:20.514 }' 00:36:20.514 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:36:20.514 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:36:20.514 pt2 00:36:20.514 pt3 00:36:20.514 pt4' 00:36:20.514 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:20.514 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:36:20.514 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:20.772 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:20.772 "name": "pt1", 00:36:20.772 "aliases": [ 00:36:20.772 "00000000-0000-0000-0000-000000000001" 00:36:20.772 ], 00:36:20.772 "product_name": "passthru", 00:36:20.772 "block_size": 512, 00:36:20.772 "num_blocks": 65536, 00:36:20.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:20.772 "assigned_rate_limits": { 00:36:20.772 "rw_ios_per_sec": 0, 00:36:20.772 "rw_mbytes_per_sec": 0, 00:36:20.772 "r_mbytes_per_sec": 0, 00:36:20.772 "w_mbytes_per_sec": 0 00:36:20.772 }, 00:36:20.772 "claimed": true, 00:36:20.772 "claim_type": "exclusive_write", 00:36:20.772 "zoned": false, 
00:36:20.772 "supported_io_types": { 00:36:20.772 "read": true, 00:36:20.772 "write": true, 00:36:20.772 "unmap": true, 00:36:20.772 "write_zeroes": true, 00:36:20.772 "flush": true, 00:36:20.772 "reset": true, 00:36:20.772 "compare": false, 00:36:20.772 "compare_and_write": false, 00:36:20.772 "abort": true, 00:36:20.772 "nvme_admin": false, 00:36:20.772 "nvme_io": false 00:36:20.772 }, 00:36:20.772 "memory_domains": [ 00:36:20.772 { 00:36:20.772 "dma_device_id": "system", 00:36:20.772 "dma_device_type": 1 00:36:20.772 }, 00:36:20.772 { 00:36:20.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:20.773 "dma_device_type": 2 00:36:20.773 } 00:36:20.773 ], 00:36:20.773 "driver_specific": { 00:36:20.773 "passthru": { 00:36:20.773 "name": "pt1", 00:36:20.773 "base_bdev_name": "malloc1" 00:36:20.773 } 00:36:20.773 } 00:36:20.773 }' 00:36:20.773 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:20.773 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:20.773 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:20.773 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:20.773 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:36:21.032 11:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:21.291 "name": "pt2", 00:36:21.291 "aliases": [ 00:36:21.291 "00000000-0000-0000-0000-000000000002" 00:36:21.291 ], 00:36:21.291 "product_name": "passthru", 00:36:21.291 "block_size": 512, 00:36:21.291 "num_blocks": 65536, 00:36:21.291 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:21.291 "assigned_rate_limits": { 00:36:21.291 "rw_ios_per_sec": 0, 00:36:21.291 "rw_mbytes_per_sec": 0, 00:36:21.291 "r_mbytes_per_sec": 0, 00:36:21.291 "w_mbytes_per_sec": 0 00:36:21.291 }, 00:36:21.291 "claimed": true, 00:36:21.291 "claim_type": "exclusive_write", 00:36:21.291 "zoned": false, 00:36:21.291 "supported_io_types": { 00:36:21.291 "read": true, 00:36:21.291 "write": true, 00:36:21.291 "unmap": true, 00:36:21.291 "write_zeroes": true, 00:36:21.291 "flush": true, 00:36:21.291 "reset": true, 00:36:21.291 "compare": false, 00:36:21.291 "compare_and_write": false, 00:36:21.291 "abort": true, 00:36:21.291 "nvme_admin": false, 00:36:21.291 "nvme_io": false 00:36:21.291 }, 00:36:21.291 "memory_domains": [ 00:36:21.291 { 00:36:21.291 "dma_device_id": "system", 00:36:21.291 "dma_device_type": 1 00:36:21.291 }, 00:36:21.291 { 00:36:21.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:21.291 "dma_device_type": 2 00:36:21.291 } 00:36:21.291 ], 00:36:21.291 "driver_specific": { 00:36:21.291 "passthru": { 00:36:21.291 "name": "pt2", 00:36:21.291 "base_bdev_name": "malloc2" 00:36:21.291 } 00:36:21.291 } 00:36:21.291 }' 00:36:21.291 11:47:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:21.291 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:21.550 "name": "pt3", 00:36:21.550 "aliases": [ 00:36:21.550 "00000000-0000-0000-0000-000000000003" 00:36:21.550 ], 00:36:21.550 "product_name": "passthru", 00:36:21.550 "block_size": 512, 00:36:21.550 "num_blocks": 65536, 00:36:21.550 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:21.550 "assigned_rate_limits": { 
00:36:21.550 "rw_ios_per_sec": 0, 00:36:21.550 "rw_mbytes_per_sec": 0, 00:36:21.550 "r_mbytes_per_sec": 0, 00:36:21.550 "w_mbytes_per_sec": 0 00:36:21.550 }, 00:36:21.550 "claimed": true, 00:36:21.550 "claim_type": "exclusive_write", 00:36:21.550 "zoned": false, 00:36:21.550 "supported_io_types": { 00:36:21.550 "read": true, 00:36:21.550 "write": true, 00:36:21.550 "unmap": true, 00:36:21.550 "write_zeroes": true, 00:36:21.550 "flush": true, 00:36:21.550 "reset": true, 00:36:21.550 "compare": false, 00:36:21.550 "compare_and_write": false, 00:36:21.550 "abort": true, 00:36:21.550 "nvme_admin": false, 00:36:21.550 "nvme_io": false 00:36:21.550 }, 00:36:21.550 "memory_domains": [ 00:36:21.550 { 00:36:21.550 "dma_device_id": "system", 00:36:21.550 "dma_device_type": 1 00:36:21.550 }, 00:36:21.550 { 00:36:21.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:21.550 "dma_device_type": 2 00:36:21.550 } 00:36:21.550 ], 00:36:21.550 "driver_specific": { 00:36:21.550 "passthru": { 00:36:21.550 "name": "pt3", 00:36:21.550 "base_bdev_name": "malloc3" 00:36:21.550 } 00:36:21.550 } 00:36:21.550 }' 00:36:21.550 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:21.807 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:22.065 "name": "pt4", 00:36:22.065 "aliases": [ 00:36:22.065 "00000000-0000-0000-0000-000000000004" 00:36:22.065 ], 00:36:22.065 "product_name": "passthru", 00:36:22.065 "block_size": 512, 00:36:22.065 "num_blocks": 65536, 00:36:22.065 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:22.065 "assigned_rate_limits": { 00:36:22.065 "rw_ios_per_sec": 0, 00:36:22.065 "rw_mbytes_per_sec": 0, 00:36:22.065 "r_mbytes_per_sec": 0, 00:36:22.065 "w_mbytes_per_sec": 0 00:36:22.065 }, 00:36:22.065 "claimed": true, 00:36:22.065 "claim_type": "exclusive_write", 00:36:22.065 "zoned": false, 00:36:22.065 "supported_io_types": { 00:36:22.065 "read": true, 00:36:22.065 "write": true, 00:36:22.065 "unmap": true, 00:36:22.065 "write_zeroes": true, 00:36:22.065 "flush": true, 00:36:22.065 "reset": true, 00:36:22.065 "compare": false, 00:36:22.065 "compare_and_write": false, 00:36:22.065 "abort": true, 00:36:22.065 "nvme_admin": false, 00:36:22.065 "nvme_io": false 00:36:22.065 }, 00:36:22.065 "memory_domains": [ 00:36:22.065 { 00:36:22.065 "dma_device_id": "system", 00:36:22.065 "dma_device_type": 1 00:36:22.065 }, 00:36:22.065 { 00:36:22.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:22.065 
"dma_device_type": 2 00:36:22.065 } 00:36:22.065 ], 00:36:22.065 "driver_specific": { 00:36:22.065 "passthru": { 00:36:22.065 "name": "pt4", 00:36:22.065 "base_bdev_name": "malloc4" 00:36:22.065 } 00:36:22.065 } 00:36:22.065 }' 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:22.065 11:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:22.065 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:22.065 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:22.323 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:36:22.582 [2024-06-10 11:47:06.352523] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:22.582 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1c16718b-d2b5-4851-af63-1f54b4c2ebe2 00:36:22.582 11:47:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1c16718b-d2b5-4851-af63-1f54b4c2ebe2 ']' 00:36:22.582 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:22.582 [2024-06-10 11:47:06.524784] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:22.582 [2024-06-10 11:47:06.524802] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:22.582 [2024-06-10 11:47:06.524842] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:22.582 [2024-06-10 11:47:06.524921] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:22.582 [2024-06-10 11:47:06.524932] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb301a0 name raid_bdev1, state offline 00:36:22.840 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:22.840 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:36:22.840 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:36:22.840 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:36:22.840 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:36:22.840 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:36:23.099 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:36:23.099 11:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:36:23.099 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:36:23.099 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:36:23.358 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:36:23.359 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:36:23.618 11:47:07 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:36:23.618 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:36:23.878 [2024-06-10 11:47:07.695781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:36:23.878 [2024-06-10 11:47:07.696802] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:36:23.878 [2024-06-10 11:47:07.696833] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:36:23.878 [2024-06-10 11:47:07.696855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:36:23.878 [2024-06-10 11:47:07.696910] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:36:23.878 [2024-06-10 11:47:07.696940] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:36:23.878 [2024-06-10 11:47:07.696955] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:36:23.878 [2024-06-10 11:47:07.696969] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:36:23.878 [2024-06-10 11:47:07.696981] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:23.878 [2024-06-10 11:47:07.696989] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb338d0 name raid_bdev1, state configuring 00:36:23.878 request: 00:36:23.878 { 00:36:23.878 "name": "raid_bdev1", 00:36:23.878 "raid_level": "raid1", 00:36:23.878 "base_bdevs": [ 00:36:23.878 "malloc1", 00:36:23.878 "malloc2", 00:36:23.878 "malloc3", 00:36:23.878 "malloc4" 00:36:23.878 ], 00:36:23.878 "superblock": false, 00:36:23.878 "method": "bdev_raid_create", 00:36:23.878 "req_id": 1 00:36:23.878 } 00:36:23.878 Got JSON-RPC error response 00:36:23.878 response: 00:36:23.878 { 00:36:23.878 "code": -17, 00:36:23.878 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:36:23.878 } 00:36:23.878 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:36:23.878 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:36:23.878 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:36:23.878 11:47:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:36:23.878 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:23.878 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 
00:36:24.137 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:36:24.137 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:36:24.137 11:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:36:24.137 [2024-06-10 11:47:08.040638] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:36:24.137 [2024-06-10 11:47:08.040673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:24.137 [2024-06-10 11:47:08.040685] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb334a0 00:36:24.137 [2024-06-10 11:47:08.040694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:24.137 [2024-06-10 11:47:08.041897] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:24.137 [2024-06-10 11:47:08.041919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:36:24.137 [2024-06-10 11:47:08.041968] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:36:24.137 [2024-06-10 11:47:08.041994] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:36:24.137 pt1 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:24.137 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:24.397 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:24.397 "name": "raid_bdev1", 00:36:24.397 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:24.397 "strip_size_kb": 0, 00:36:24.397 "state": "configuring", 00:36:24.397 "raid_level": "raid1", 00:36:24.397 "superblock": true, 00:36:24.397 "num_base_bdevs": 4, 00:36:24.397 "num_base_bdevs_discovered": 1, 00:36:24.397 "num_base_bdevs_operational": 4, 00:36:24.397 "base_bdevs_list": [ 00:36:24.397 { 00:36:24.397 "name": "pt1", 00:36:24.397 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:24.397 "is_configured": true, 00:36:24.397 "data_offset": 2048, 00:36:24.397 "data_size": 63488 00:36:24.397 }, 00:36:24.397 { 00:36:24.397 "name": null, 00:36:24.397 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:24.397 "is_configured": false, 00:36:24.397 "data_offset": 2048, 00:36:24.397 "data_size": 63488 00:36:24.397 }, 00:36:24.397 { 00:36:24.397 "name": null, 00:36:24.397 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:24.397 "is_configured": false, 00:36:24.397 "data_offset": 2048, 00:36:24.397 
"data_size": 63488 00:36:24.397 }, 00:36:24.397 { 00:36:24.397 "name": null, 00:36:24.397 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:24.397 "is_configured": false, 00:36:24.397 "data_offset": 2048, 00:36:24.397 "data_size": 63488 00:36:24.397 } 00:36:24.397 ] 00:36:24.397 }' 00:36:24.397 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:24.397 11:47:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:24.965 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:36:24.965 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:36:24.965 [2024-06-10 11:47:08.902884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:36:24.965 [2024-06-10 11:47:08.902924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:24.965 [2024-06-10 11:47:08.902938] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x980a00 00:36:24.965 [2024-06-10 11:47:08.902946] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:24.965 [2024-06-10 11:47:08.903196] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:24.965 [2024-06-10 11:47:08.903210] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:36:24.965 [2024-06-10 11:47:08.903259] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:36:24.965 [2024-06-10 11:47:08.903276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:36:24.965 pt2 00:36:25.224 11:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:36:25.224 [2024-06-10 11:47:09.079346] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:25.224 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:25.483 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:25.483 "name": "raid_bdev1", 00:36:25.483 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:25.483 "strip_size_kb": 0, 00:36:25.483 "state": "configuring", 00:36:25.483 "raid_level": "raid1", 00:36:25.483 "superblock": true, 00:36:25.483 "num_base_bdevs": 4, 00:36:25.483 "num_base_bdevs_discovered": 1, 00:36:25.483 
"num_base_bdevs_operational": 4, 00:36:25.483 "base_bdevs_list": [ 00:36:25.483 { 00:36:25.483 "name": "pt1", 00:36:25.483 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:25.483 "is_configured": true, 00:36:25.483 "data_offset": 2048, 00:36:25.483 "data_size": 63488 00:36:25.483 }, 00:36:25.483 { 00:36:25.483 "name": null, 00:36:25.483 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:25.483 "is_configured": false, 00:36:25.483 "data_offset": 2048, 00:36:25.483 "data_size": 63488 00:36:25.483 }, 00:36:25.483 { 00:36:25.483 "name": null, 00:36:25.483 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:25.483 "is_configured": false, 00:36:25.483 "data_offset": 2048, 00:36:25.483 "data_size": 63488 00:36:25.483 }, 00:36:25.483 { 00:36:25.483 "name": null, 00:36:25.483 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:25.483 "is_configured": false, 00:36:25.483 "data_offset": 2048, 00:36:25.483 "data_size": 63488 00:36:25.483 } 00:36:25.483 ] 00:36:25.483 }' 00:36:25.483 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:25.483 11:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:26.052 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:36:26.052 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:36:26.052 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:36:26.052 [2024-06-10 11:47:09.897451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:36:26.052 [2024-06-10 11:47:09.897491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:26.052 [2024-06-10 11:47:09.897505] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2dbe0 
00:36:26.052 [2024-06-10 11:47:09.897513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:26.052 [2024-06-10 11:47:09.897746] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:26.052 [2024-06-10 11:47:09.897757] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:36:26.052 [2024-06-10 11:47:09.897802] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:36:26.052 [2024-06-10 11:47:09.897816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:36:26.052 pt2 00:36:26.052 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:36:26.052 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:36:26.052 11:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:36:26.312 [2024-06-10 11:47:10.085962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:36:26.312 [2024-06-10 11:47:10.085993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:26.312 [2024-06-10 11:47:10.086005] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2de50 00:36:26.312 [2024-06-10 11:47:10.086013] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:26.312 [2024-06-10 11:47:10.086230] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:26.312 [2024-06-10 11:47:10.086244] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:36:26.312 [2024-06-10 11:47:10.086282] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:36:26.312 [2024-06-10 11:47:10.086295] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:36:26.312 pt3 00:36:26.312 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:36:26.312 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:36:26.312 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:36:26.571 [2024-06-10 11:47:10.270444] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:36:26.571 [2024-06-10 11:47:10.270477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:26.572 [2024-06-10 11:47:10.270490] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2c630 00:36:26.572 [2024-06-10 11:47:10.270498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:26.572 [2024-06-10 11:47:10.270725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:26.572 [2024-06-10 11:47:10.270737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:36:26.572 [2024-06-10 11:47:10.270779] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:36:26.572 [2024-06-10 11:47:10.270793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:36:26.572 [2024-06-10 11:47:10.270887] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x980d20 00:36:26.572 [2024-06-10 11:47:10.270895] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:26.572 [2024-06-10 11:47:10.271006] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2ce90 00:36:26.572 [2024-06-10 11:47:10.271098] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x980d20 
00:36:26.572 [2024-06-10 11:47:10.271105] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x980d20 00:36:26.572 [2024-06-10 11:47:10.271170] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:26.572 pt4 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:36:26.572 "name": "raid_bdev1", 00:36:26.572 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:26.572 "strip_size_kb": 0, 00:36:26.572 "state": "online", 00:36:26.572 "raid_level": "raid1", 00:36:26.572 "superblock": true, 00:36:26.572 "num_base_bdevs": 4, 00:36:26.572 "num_base_bdevs_discovered": 4, 00:36:26.572 "num_base_bdevs_operational": 4, 00:36:26.572 "base_bdevs_list": [ 00:36:26.572 { 00:36:26.572 "name": "pt1", 00:36:26.572 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:26.572 "is_configured": true, 00:36:26.572 "data_offset": 2048, 00:36:26.572 "data_size": 63488 00:36:26.572 }, 00:36:26.572 { 00:36:26.572 "name": "pt2", 00:36:26.572 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:26.572 "is_configured": true, 00:36:26.572 "data_offset": 2048, 00:36:26.572 "data_size": 63488 00:36:26.572 }, 00:36:26.572 { 00:36:26.572 "name": "pt3", 00:36:26.572 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:26.572 "is_configured": true, 00:36:26.572 "data_offset": 2048, 00:36:26.572 "data_size": 63488 00:36:26.572 }, 00:36:26.572 { 00:36:26.572 "name": "pt4", 00:36:26.572 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:26.572 "is_configured": true, 00:36:26.572 "data_offset": 2048, 00:36:26.572 "data_size": 63488 00:36:26.572 } 00:36:26.572 ] 00:36:26.572 }' 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:26.572 11:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:36:27.141 11:47:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:27.141 11:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:36:27.400 [2024-06-10 11:47:11.128842] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:27.400 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:36:27.400 "name": "raid_bdev1", 00:36:27.400 "aliases": [ 00:36:27.400 "1c16718b-d2b5-4851-af63-1f54b4c2ebe2" 00:36:27.400 ], 00:36:27.400 "product_name": "Raid Volume", 00:36:27.400 "block_size": 512, 00:36:27.400 "num_blocks": 63488, 00:36:27.400 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:27.400 "assigned_rate_limits": { 00:36:27.400 "rw_ios_per_sec": 0, 00:36:27.400 "rw_mbytes_per_sec": 0, 00:36:27.400 "r_mbytes_per_sec": 0, 00:36:27.400 "w_mbytes_per_sec": 0 00:36:27.400 }, 00:36:27.400 "claimed": false, 00:36:27.400 "zoned": false, 00:36:27.400 "supported_io_types": { 00:36:27.400 "read": true, 00:36:27.400 "write": true, 00:36:27.400 "unmap": false, 00:36:27.400 "write_zeroes": true, 00:36:27.400 "flush": false, 00:36:27.400 "reset": true, 00:36:27.400 "compare": false, 00:36:27.400 "compare_and_write": false, 00:36:27.400 "abort": false, 00:36:27.400 "nvme_admin": false, 00:36:27.400 "nvme_io": false 00:36:27.400 }, 00:36:27.400 "memory_domains": [ 00:36:27.400 { 00:36:27.400 "dma_device_id": "system", 00:36:27.400 "dma_device_type": 1 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:27.400 "dma_device_type": 2 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "system", 00:36:27.400 "dma_device_type": 1 
00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:27.400 "dma_device_type": 2 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "system", 00:36:27.400 "dma_device_type": 1 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:27.400 "dma_device_type": 2 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "system", 00:36:27.400 "dma_device_type": 1 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:27.400 "dma_device_type": 2 00:36:27.400 } 00:36:27.400 ], 00:36:27.400 "driver_specific": { 00:36:27.400 "raid": { 00:36:27.400 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:27.400 "strip_size_kb": 0, 00:36:27.400 "state": "online", 00:36:27.400 "raid_level": "raid1", 00:36:27.400 "superblock": true, 00:36:27.400 "num_base_bdevs": 4, 00:36:27.400 "num_base_bdevs_discovered": 4, 00:36:27.400 "num_base_bdevs_operational": 4, 00:36:27.400 "base_bdevs_list": [ 00:36:27.400 { 00:36:27.400 "name": "pt1", 00:36:27.400 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:27.400 "is_configured": true, 00:36:27.400 "data_offset": 2048, 00:36:27.400 "data_size": 63488 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "name": "pt2", 00:36:27.400 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:27.400 "is_configured": true, 00:36:27.400 "data_offset": 2048, 00:36:27.400 "data_size": 63488 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "name": "pt3", 00:36:27.400 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:27.400 "is_configured": true, 00:36:27.400 "data_offset": 2048, 00:36:27.400 "data_size": 63488 00:36:27.400 }, 00:36:27.400 { 00:36:27.400 "name": "pt4", 00:36:27.400 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:27.400 "is_configured": true, 00:36:27.400 "data_offset": 2048, 00:36:27.400 "data_size": 63488 00:36:27.400 } 00:36:27.400 ] 00:36:27.400 } 00:36:27.400 } 00:36:27.400 }' 00:36:27.400 11:47:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:36:27.400 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:36:27.400 pt2 00:36:27.400 pt3 00:36:27.400 pt4' 00:36:27.400 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:27.400 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:36:27.400 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:27.659 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:27.659 "name": "pt1", 00:36:27.660 "aliases": [ 00:36:27.660 "00000000-0000-0000-0000-000000000001" 00:36:27.660 ], 00:36:27.660 "product_name": "passthru", 00:36:27.660 "block_size": 512, 00:36:27.660 "num_blocks": 65536, 00:36:27.660 "uuid": "00000000-0000-0000-0000-000000000001", 00:36:27.660 "assigned_rate_limits": { 00:36:27.660 "rw_ios_per_sec": 0, 00:36:27.660 "rw_mbytes_per_sec": 0, 00:36:27.660 "r_mbytes_per_sec": 0, 00:36:27.660 "w_mbytes_per_sec": 0 00:36:27.660 }, 00:36:27.660 "claimed": true, 00:36:27.660 "claim_type": "exclusive_write", 00:36:27.660 "zoned": false, 00:36:27.660 "supported_io_types": { 00:36:27.660 "read": true, 00:36:27.660 "write": true, 00:36:27.660 "unmap": true, 00:36:27.660 "write_zeroes": true, 00:36:27.660 "flush": true, 00:36:27.660 "reset": true, 00:36:27.660 "compare": false, 00:36:27.660 "compare_and_write": false, 00:36:27.660 "abort": true, 00:36:27.660 "nvme_admin": false, 00:36:27.660 "nvme_io": false 00:36:27.660 }, 00:36:27.660 "memory_domains": [ 00:36:27.660 { 00:36:27.660 "dma_device_id": "system", 00:36:27.660 "dma_device_type": 1 00:36:27.660 }, 00:36:27.660 { 00:36:27.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:36:27.660 "dma_device_type": 2 00:36:27.660 } 00:36:27.660 ], 00:36:27.660 "driver_specific": { 00:36:27.660 "passthru": { 00:36:27.660 "name": "pt1", 00:36:27.660 "base_bdev_name": "malloc1" 00:36:27.660 } 00:36:27.660 } 00:36:27.660 }' 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:27.660 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:27.919 "name": "pt2", 00:36:27.919 "aliases": [ 00:36:27.919 
"00000000-0000-0000-0000-000000000002" 00:36:27.919 ], 00:36:27.919 "product_name": "passthru", 00:36:27.919 "block_size": 512, 00:36:27.919 "num_blocks": 65536, 00:36:27.919 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:27.919 "assigned_rate_limits": { 00:36:27.919 "rw_ios_per_sec": 0, 00:36:27.919 "rw_mbytes_per_sec": 0, 00:36:27.919 "r_mbytes_per_sec": 0, 00:36:27.919 "w_mbytes_per_sec": 0 00:36:27.919 }, 00:36:27.919 "claimed": true, 00:36:27.919 "claim_type": "exclusive_write", 00:36:27.919 "zoned": false, 00:36:27.919 "supported_io_types": { 00:36:27.919 "read": true, 00:36:27.919 "write": true, 00:36:27.919 "unmap": true, 00:36:27.919 "write_zeroes": true, 00:36:27.919 "flush": true, 00:36:27.919 "reset": true, 00:36:27.919 "compare": false, 00:36:27.919 "compare_and_write": false, 00:36:27.919 "abort": true, 00:36:27.919 "nvme_admin": false, 00:36:27.919 "nvme_io": false 00:36:27.919 }, 00:36:27.919 "memory_domains": [ 00:36:27.919 { 00:36:27.919 "dma_device_id": "system", 00:36:27.919 "dma_device_type": 1 00:36:27.919 }, 00:36:27.919 { 00:36:27.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:27.919 "dma_device_type": 2 00:36:27.919 } 00:36:27.919 ], 00:36:27.919 "driver_specific": { 00:36:27.919 "passthru": { 00:36:27.919 "name": "pt2", 00:36:27.919 "base_bdev_name": "malloc2" 00:36:27.919 } 00:36:27.919 } 00:36:27.919 }' 00:36:27.919 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:28.178 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:28.178 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:28.178 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:28.178 11:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:28.178 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:28.178 11:47:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:28.178 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:28.178 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:28.178 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:28.437 "name": "pt3", 00:36:28.437 "aliases": [ 00:36:28.437 "00000000-0000-0000-0000-000000000003" 00:36:28.437 ], 00:36:28.437 "product_name": "passthru", 00:36:28.437 "block_size": 512, 00:36:28.437 "num_blocks": 65536, 00:36:28.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:28.437 "assigned_rate_limits": { 00:36:28.437 "rw_ios_per_sec": 0, 00:36:28.437 "rw_mbytes_per_sec": 0, 00:36:28.437 "r_mbytes_per_sec": 0, 00:36:28.437 "w_mbytes_per_sec": 0 00:36:28.437 }, 00:36:28.437 "claimed": true, 00:36:28.437 "claim_type": "exclusive_write", 00:36:28.437 "zoned": false, 00:36:28.437 "supported_io_types": { 00:36:28.437 "read": true, 00:36:28.437 "write": true, 00:36:28.437 "unmap": true, 00:36:28.437 "write_zeroes": true, 00:36:28.437 "flush": true, 00:36:28.437 "reset": true, 00:36:28.437 "compare": false, 00:36:28.437 "compare_and_write": false, 00:36:28.437 "abort": true, 00:36:28.437 
"nvme_admin": false, 00:36:28.437 "nvme_io": false 00:36:28.437 }, 00:36:28.437 "memory_domains": [ 00:36:28.437 { 00:36:28.437 "dma_device_id": "system", 00:36:28.437 "dma_device_type": 1 00:36:28.437 }, 00:36:28.437 { 00:36:28.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:28.437 "dma_device_type": 2 00:36:28.437 } 00:36:28.437 ], 00:36:28.437 "driver_specific": { 00:36:28.437 "passthru": { 00:36:28.437 "name": "pt3", 00:36:28.437 "base_bdev_name": "malloc3" 00:36:28.437 } 00:36:28.437 } 00:36:28.437 }' 00:36:28.437 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:28.696 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt4 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:36:28.955 "name": "pt4", 00:36:28.955 "aliases": [ 00:36:28.955 "00000000-0000-0000-0000-000000000004" 00:36:28.955 ], 00:36:28.955 "product_name": "passthru", 00:36:28.955 "block_size": 512, 00:36:28.955 "num_blocks": 65536, 00:36:28.955 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:28.955 "assigned_rate_limits": { 00:36:28.955 "rw_ios_per_sec": 0, 00:36:28.955 "rw_mbytes_per_sec": 0, 00:36:28.955 "r_mbytes_per_sec": 0, 00:36:28.955 "w_mbytes_per_sec": 0 00:36:28.955 }, 00:36:28.955 "claimed": true, 00:36:28.955 "claim_type": "exclusive_write", 00:36:28.955 "zoned": false, 00:36:28.955 "supported_io_types": { 00:36:28.955 "read": true, 00:36:28.955 "write": true, 00:36:28.955 "unmap": true, 00:36:28.955 "write_zeroes": true, 00:36:28.955 "flush": true, 00:36:28.955 "reset": true, 00:36:28.955 "compare": false, 00:36:28.955 "compare_and_write": false, 00:36:28.955 "abort": true, 00:36:28.955 "nvme_admin": false, 00:36:28.955 "nvme_io": false 00:36:28.955 }, 00:36:28.955 "memory_domains": [ 00:36:28.955 { 00:36:28.955 "dma_device_id": "system", 00:36:28.955 "dma_device_type": 1 00:36:28.955 }, 00:36:28.955 { 00:36:28.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:28.955 "dma_device_type": 2 00:36:28.955 } 00:36:28.955 ], 00:36:28.955 "driver_specific": { 00:36:28.955 "passthru": { 00:36:28.955 "name": "pt4", 00:36:28.955 "base_bdev_name": "malloc4" 00:36:28.955 } 00:36:28.955 } 00:36:28.955 }' 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:28.955 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:36:29.214 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:36:29.214 11:47:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:29.214 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:36:29.214 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:36:29.214 11:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:29.214 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:36:29.214 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:36:29.214 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:29.214 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:36:29.473 [2024-06-10 11:47:13.322570] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1c16718b-d2b5-4851-af63-1f54b4c2ebe2 '!=' 1c16718b-d2b5-4851-af63-1f54b4c2ebe2 ']' 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:36:29.473 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:36:29.732 [2024-06-10 11:47:13.498875] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:29.732 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:29.991 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:29.991 "name": "raid_bdev1", 00:36:29.991 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:29.991 "strip_size_kb": 0, 00:36:29.991 "state": "online", 00:36:29.991 "raid_level": "raid1", 00:36:29.991 "superblock": true, 00:36:29.991 "num_base_bdevs": 4, 00:36:29.991 "num_base_bdevs_discovered": 3, 00:36:29.991 "num_base_bdevs_operational": 3, 00:36:29.991 "base_bdevs_list": [ 00:36:29.991 { 
00:36:29.991 "name": null, 00:36:29.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:29.991 "is_configured": false, 00:36:29.991 "data_offset": 2048, 00:36:29.991 "data_size": 63488 00:36:29.991 }, 00:36:29.991 { 00:36:29.991 "name": "pt2", 00:36:29.991 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:29.991 "is_configured": true, 00:36:29.991 "data_offset": 2048, 00:36:29.991 "data_size": 63488 00:36:29.991 }, 00:36:29.991 { 00:36:29.991 "name": "pt3", 00:36:29.991 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:29.991 "is_configured": true, 00:36:29.991 "data_offset": 2048, 00:36:29.991 "data_size": 63488 00:36:29.991 }, 00:36:29.991 { 00:36:29.991 "name": "pt4", 00:36:29.991 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:29.991 "is_configured": true, 00:36:29.991 "data_offset": 2048, 00:36:29.991 "data_size": 63488 00:36:29.991 } 00:36:29.991 ] 00:36:29.991 }' 00:36:29.992 11:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:29.992 11:47:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:30.558 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:30.558 [2024-06-10 11:47:14.357095] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:30.558 [2024-06-10 11:47:14.357117] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:30.558 [2024-06-10 11:47:14.357153] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:30.558 [2024-06-10 11:47:14.357199] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:30.558 [2024-06-10 11:47:14.357208] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x980d20 name raid_bdev1, state offline 00:36:30.558 11:47:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:30.558 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:36:30.817 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:36:31.076 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:36:31.076 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:36:31.076 11:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:36:31.335 11:47:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:36:31.335 [2024-06-10 11:47:15.223473] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:36:31.335 [2024-06-10 11:47:15.223510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:31.335 [2024-06-10 11:47:15.223522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9815c0 00:36:31.335 [2024-06-10 11:47:15.223530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:31.335 [2024-06-10 11:47:15.224696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:31.335 [2024-06-10 11:47:15.224722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:36:31.335 [2024-06-10 11:47:15.224772] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:36:31.335 [2024-06-10 11:47:15.224796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:36:31.335 pt2 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:31.335 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:31.594 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:31.594 "name": "raid_bdev1", 00:36:31.594 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:31.594 "strip_size_kb": 0, 00:36:31.594 "state": "configuring", 00:36:31.594 "raid_level": "raid1", 00:36:31.594 "superblock": true, 00:36:31.594 "num_base_bdevs": 4, 00:36:31.594 "num_base_bdevs_discovered": 1, 00:36:31.594 "num_base_bdevs_operational": 3, 00:36:31.594 "base_bdevs_list": [ 00:36:31.594 { 00:36:31.594 "name": null, 00:36:31.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:31.594 "is_configured": false, 00:36:31.594 "data_offset": 2048, 00:36:31.594 "data_size": 63488 00:36:31.594 }, 00:36:31.594 { 00:36:31.594 "name": "pt2", 00:36:31.594 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:31.594 "is_configured": true, 00:36:31.594 "data_offset": 2048, 00:36:31.594 "data_size": 63488 00:36:31.594 }, 00:36:31.594 { 00:36:31.594 "name": null, 00:36:31.594 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:31.594 "is_configured": false, 00:36:31.594 "data_offset": 2048, 00:36:31.594 "data_size": 63488 00:36:31.594 }, 00:36:31.594 { 00:36:31.594 "name": null, 00:36:31.594 "uuid": 
"00000000-0000-0000-0000-000000000004", 00:36:31.594 "is_configured": false, 00:36:31.594 "data_offset": 2048, 00:36:31.594 "data_size": 63488 00:36:31.594 } 00:36:31.594 ] 00:36:31.594 }' 00:36:31.595 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:31.595 11:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:32.161 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:36:32.161 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:36:32.161 11:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:36:32.161 [2024-06-10 11:47:16.037568] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:36:32.161 [2024-06-10 11:47:16.037605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:32.161 [2024-06-10 11:47:16.037619] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2ead0 00:36:32.161 [2024-06-10 11:47:16.037627] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:32.161 [2024-06-10 11:47:16.037855] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:32.161 [2024-06-10 11:47:16.037872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:36:32.161 [2024-06-10 11:47:16.037933] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:36:32.161 [2024-06-10 11:47:16.037947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:36:32.161 pt3 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:36:32.161 11:47:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:32.161 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:32.420 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:32.420 "name": "raid_bdev1", 00:36:32.420 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:32.420 "strip_size_kb": 0, 00:36:32.420 "state": "configuring", 00:36:32.420 "raid_level": "raid1", 00:36:32.420 "superblock": true, 00:36:32.420 "num_base_bdevs": 4, 00:36:32.420 "num_base_bdevs_discovered": 2, 00:36:32.420 "num_base_bdevs_operational": 3, 00:36:32.420 "base_bdevs_list": [ 00:36:32.420 { 00:36:32.420 "name": null, 00:36:32.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:32.420 "is_configured": false, 00:36:32.420 "data_offset": 2048, 00:36:32.420 "data_size": 63488 00:36:32.420 }, 
00:36:32.420 { 00:36:32.420 "name": "pt2", 00:36:32.420 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:32.420 "is_configured": true, 00:36:32.420 "data_offset": 2048, 00:36:32.420 "data_size": 63488 00:36:32.420 }, 00:36:32.420 { 00:36:32.420 "name": "pt3", 00:36:32.420 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:32.420 "is_configured": true, 00:36:32.420 "data_offset": 2048, 00:36:32.420 "data_size": 63488 00:36:32.420 }, 00:36:32.420 { 00:36:32.420 "name": null, 00:36:32.420 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:32.420 "is_configured": false, 00:36:32.420 "data_offset": 2048, 00:36:32.420 "data_size": 63488 00:36:32.420 } 00:36:32.420 ] 00:36:32.420 }' 00:36:32.420 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:32.420 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:36:32.987 [2024-06-10 11:47:16.899809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:36:32.987 [2024-06-10 11:47:16.899845] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:32.987 [2024-06-10 11:47:16.899858] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb183a0 00:36:32.987 [2024-06-10 11:47:16.899873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:32.987 [2024-06-10 11:47:16.900108] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:36:32.987 [2024-06-10 11:47:16.900122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:36:32.987 [2024-06-10 11:47:16.900167] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:36:32.987 [2024-06-10 11:47:16.900182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:36:32.987 [2024-06-10 11:47:16.900261] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb32f50 00:36:32.987 [2024-06-10 11:47:16.900268] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:32.987 [2024-06-10 11:47:16.900379] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2c840 00:36:32.987 [2024-06-10 11:47:16.900469] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb32f50 00:36:32.987 [2024-06-10 11:47:16.900476] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb32f50 00:36:32.987 [2024-06-10 11:47:16.900542] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:32.987 pt4 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:32.987 11:47:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:32.987 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:33.245 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:33.246 "name": "raid_bdev1", 00:36:33.246 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:33.246 "strip_size_kb": 0, 00:36:33.246 "state": "online", 00:36:33.246 "raid_level": "raid1", 00:36:33.246 "superblock": true, 00:36:33.246 "num_base_bdevs": 4, 00:36:33.246 "num_base_bdevs_discovered": 3, 00:36:33.246 "num_base_bdevs_operational": 3, 00:36:33.246 "base_bdevs_list": [ 00:36:33.246 { 00:36:33.246 "name": null, 00:36:33.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:33.246 "is_configured": false, 00:36:33.246 "data_offset": 2048, 00:36:33.246 "data_size": 63488 00:36:33.246 }, 00:36:33.246 { 00:36:33.246 "name": "pt2", 00:36:33.246 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:33.246 "is_configured": true, 00:36:33.246 "data_offset": 2048, 00:36:33.246 "data_size": 63488 00:36:33.246 }, 00:36:33.246 { 00:36:33.246 "name": "pt3", 00:36:33.246 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:33.246 "is_configured": true, 00:36:33.246 "data_offset": 2048, 00:36:33.246 "data_size": 63488 00:36:33.246 }, 00:36:33.246 { 00:36:33.246 "name": "pt4", 00:36:33.246 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:33.246 "is_configured": true, 00:36:33.246 "data_offset": 2048, 00:36:33.246 "data_size": 63488 00:36:33.246 } 
00:36:33.246 ] 00:36:33.246 }' 00:36:33.246 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:33.246 11:47:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:33.813 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:33.813 [2024-06-10 11:47:17.741980] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:33.813 [2024-06-10 11:47:17.741997] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:33.813 [2024-06-10 11:47:17.742031] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:33.813 [2024-06-10 11:47:17.742075] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:33.813 [2024-06-10 11:47:17.742082] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb32f50 name raid_bdev1, state offline 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:36:34.094 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:36:34.397 11:47:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:36:34.397 [2024-06-10 11:47:18.271342] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:36:34.397 [2024-06-10 11:47:18.271378] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:34.397 [2024-06-10 11:47:18.271390] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2b370 00:36:34.397 [2024-06-10 11:47:18.271400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:34.397 [2024-06-10 11:47:18.272659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:34.397 [2024-06-10 11:47:18.272683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:36:34.397 [2024-06-10 11:47:18.272732] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:36:34.397 [2024-06-10 11:47:18.272755] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:36:34.397 [2024-06-10 11:47:18.272831] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:36:34.397 [2024-06-10 11:47:18.272841] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:34.397 [2024-06-10 11:47:18.272851] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2c330 name raid_bdev1, state configuring 00:36:34.397 [2024-06-10 11:47:18.272877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:36:34.397 [2024-06-10 11:47:18.272952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:36:34.397 pt1 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 
00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:34.397 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:34.660 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:34.660 "name": "raid_bdev1", 00:36:34.660 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:34.660 "strip_size_kb": 0, 00:36:34.660 "state": "configuring", 00:36:34.660 "raid_level": "raid1", 00:36:34.660 "superblock": true, 00:36:34.660 "num_base_bdevs": 4, 00:36:34.660 "num_base_bdevs_discovered": 2, 00:36:34.660 "num_base_bdevs_operational": 3, 00:36:34.660 "base_bdevs_list": [ 00:36:34.660 { 00:36:34.660 "name": null, 00:36:34.660 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:36:34.660 "is_configured": false, 00:36:34.660 "data_offset": 2048, 00:36:34.660 "data_size": 63488 00:36:34.660 }, 00:36:34.660 { 00:36:34.660 "name": "pt2", 00:36:34.660 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:34.660 "is_configured": true, 00:36:34.660 "data_offset": 2048, 00:36:34.660 "data_size": 63488 00:36:34.660 }, 00:36:34.660 { 00:36:34.660 "name": "pt3", 00:36:34.660 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:34.660 "is_configured": true, 00:36:34.660 "data_offset": 2048, 00:36:34.660 "data_size": 63488 00:36:34.660 }, 00:36:34.660 { 00:36:34.660 "name": null, 00:36:34.660 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:34.660 "is_configured": false, 00:36:34.660 "data_offset": 2048, 00:36:34.660 "data_size": 63488 00:36:34.660 } 00:36:34.660 ] 00:36:34.660 }' 00:36:34.660 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:34.660 11:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:35.225 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:36:35.225 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:36:35.225 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:36:35.225 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:36:35.483 [2024-06-10 11:47:19.294000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:36:35.483 [2024-06-10 11:47:19.294039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:35.483 
[2024-06-10 11:47:19.294055] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2b5a0 00:36:35.483 [2024-06-10 11:47:19.294063] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:35.483 [2024-06-10 11:47:19.294305] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:35.483 [2024-06-10 11:47:19.294319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:36:35.483 [2024-06-10 11:47:19.294365] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:36:35.483 [2024-06-10 11:47:19.294381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:36:35.483 [2024-06-10 11:47:19.294460] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x981280 00:36:35.483 [2024-06-10 11:47:19.294467] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:35.483 [2024-06-10 11:47:19.294576] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb180e0 00:36:35.483 [2024-06-10 11:47:19.294665] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x981280 00:36:35.483 [2024-06-10 11:47:19.294671] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x981280 00:36:35.483 [2024-06-10 11:47:19.294741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:35.483 pt4 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:35.483 11:47:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:35.483 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:35.740 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:35.740 "name": "raid_bdev1", 00:36:35.740 "uuid": "1c16718b-d2b5-4851-af63-1f54b4c2ebe2", 00:36:35.740 "strip_size_kb": 0, 00:36:35.740 "state": "online", 00:36:35.740 "raid_level": "raid1", 00:36:35.740 "superblock": true, 00:36:35.740 "num_base_bdevs": 4, 00:36:35.740 "num_base_bdevs_discovered": 3, 00:36:35.740 "num_base_bdevs_operational": 3, 00:36:35.740 "base_bdevs_list": [ 00:36:35.740 { 00:36:35.740 "name": null, 00:36:35.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:35.740 "is_configured": false, 00:36:35.740 "data_offset": 2048, 00:36:35.740 "data_size": 63488 00:36:35.740 }, 00:36:35.740 { 00:36:35.741 "name": "pt2", 00:36:35.741 "uuid": "00000000-0000-0000-0000-000000000002", 00:36:35.741 "is_configured": true, 00:36:35.741 "data_offset": 2048, 00:36:35.741 "data_size": 63488 00:36:35.741 }, 00:36:35.741 { 00:36:35.741 "name": "pt3", 00:36:35.741 "uuid": "00000000-0000-0000-0000-000000000003", 00:36:35.741 
"is_configured": true, 00:36:35.741 "data_offset": 2048, 00:36:35.741 "data_size": 63488 00:36:35.741 }, 00:36:35.741 { 00:36:35.741 "name": "pt4", 00:36:35.741 "uuid": "00000000-0000-0000-0000-000000000004", 00:36:35.741 "is_configured": true, 00:36:35.741 "data_offset": 2048, 00:36:35.741 "data_size": 63488 00:36:35.741 } 00:36:35.741 ] 00:36:35.741 }' 00:36:35.741 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:35.741 11:47:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:36.306 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:36:36.307 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:36:36.307 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:36:36.307 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:36:36.307 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:36.565 [2024-06-10 11:47:20.336865] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 1c16718b-d2b5-4851-af63-1f54b4c2ebe2 '!=' 1c16718b-d2b5-4851-af63-1f54b4c2ebe2 ']' 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 211433 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 211433 ']' 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 211433 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 
00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 211433 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 211433' 00:36:36.565 killing process with pid 211433 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 211433 00:36:36.565 [2024-06-10 11:47:20.384279] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:36:36.565 [2024-06-10 11:47:20.384316] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:36.565 [2024-06-10 11:47:20.384365] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:36.565 [2024-06-10 11:47:20.384373] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x981280 name raid_bdev1, state offline 00:36:36.565 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 211433 00:36:36.565 [2024-06-10 11:47:20.419556] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:36:36.824 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:36:36.824 00:36:36.824 real 0m19.513s 00:36:36.824 user 0m35.387s 00:36:36.824 sys 0m3.720s 00:36:36.824 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:36:36.824 11:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:36:36.824 ************************************ 00:36:36.824 END TEST raid_superblock_test 00:36:36.824 
************************************ 00:36:36.824 11:47:20 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:36:36.824 11:47:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:36:36.824 11:47:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:36:36.824 11:47:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:36:36.824 ************************************ 00:36:36.824 START TEST raid_read_error_test 00:36:36.824 ************************************ 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 read 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:36.824 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 
00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.dapxt0toVg 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=214441 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@809 -- # waitforlisten 214441 /var/tmp/spdk-raid.sock 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 214441 ']' 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:36:36.825 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:36:36.825 11:47:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:36.825 [2024-06-10 11:47:20.740293] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:36:36.825 [2024-06-10 11:47:20.740344] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214441 ] 00:36:37.084 [2024-06-10 11:47:20.826668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:37.084 [2024-06-10 11:47:20.915093] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:36:37.084 [2024-06-10 11:47:20.976734] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:37.084 [2024-06-10 11:47:20.976765] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:37.651 11:47:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:36:37.651 11:47:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:36:37.651 11:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:37.651 11:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:36:37.909 BaseBdev1_malloc 00:36:37.909 11:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:36:38.167 true 00:36:38.167 11:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:36:38.167 [2024-06-10 11:47:22.033571] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:36:38.167 [2024-06-10 11:47:22.033607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:38.167 
[2024-06-10 11:47:22.033621] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d9b10 00:36:38.167 [2024-06-10 11:47:22.033629] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:38.167 [2024-06-10 11:47:22.034895] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:38.167 [2024-06-10 11:47:22.034920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:36:38.167 BaseBdev1 00:36:38.167 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:38.167 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:36:38.423 BaseBdev2_malloc 00:36:38.423 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:36:38.679 true 00:36:38.679 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:36:38.679 [2024-06-10 11:47:22.558825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:36:38.679 [2024-06-10 11:47:22.558862] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:38.679 [2024-06-10 11:47:22.558884] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10de280 00:36:38.679 [2024-06-10 11:47:22.558892] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:38.679 [2024-06-10 11:47:22.560021] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:38.679 [2024-06-10 11:47:22.560043] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:36:38.679 BaseBdev2 00:36:38.679 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:38.679 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:36:38.935 BaseBdev3_malloc 00:36:38.935 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:36:39.192 true 00:36:39.192 11:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:36:39.192 [2024-06-10 11:47:23.085157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:36:39.192 [2024-06-10 11:47:23.085195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:39.192 [2024-06-10 11:47:23.085211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10e0ab0 00:36:39.192 [2024-06-10 11:47:23.085219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:39.192 [2024-06-10 11:47:23.086371] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:39.192 [2024-06-10 11:47:23.086396] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:36:39.192 BaseBdev3 00:36:39.192 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:39.192 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:36:39.449 BaseBdev4_malloc 00:36:39.449 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:36:39.706 true 00:36:39.706 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:36:39.706 [2024-06-10 11:47:23.595438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:36:39.706 [2024-06-10 11:47:23.595473] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:39.706 [2024-06-10 11:47:23.595487] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10e1380 00:36:39.706 [2024-06-10 11:47:23.595496] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:39.706 [2024-06-10 11:47:23.596611] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:39.706 [2024-06-10 11:47:23.596639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:36:39.706 BaseBdev4 00:36:39.706 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:36:39.964 [2024-06-10 11:47:23.763910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:39.964 [2024-06-10 11:47:23.764908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:39.964 [2024-06-10 11:47:23.764957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:39.964 [2024-06-10 11:47:23.764998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:36:39.964 [2024-06-10 11:47:23.765170] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10d9020 00:36:39.964 [2024-06-10 11:47:23.765179] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:39.964 [2024-06-10 11:47:23.765324] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10db400 00:36:39.964 [2024-06-10 11:47:23.765439] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10d9020 00:36:39.964 [2024-06-10 11:47:23.765447] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10d9020 00:36:39.964 [2024-06-10 11:47:23.765522] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:39.964 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:40.222 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:40.222 "name": "raid_bdev1", 00:36:40.222 "uuid": "70e09cef-3a95-4ab2-ae02-925b2aa6df83", 00:36:40.222 "strip_size_kb": 0, 00:36:40.222 "state": "online", 00:36:40.222 "raid_level": "raid1", 00:36:40.222 "superblock": true, 00:36:40.222 "num_base_bdevs": 4, 00:36:40.222 "num_base_bdevs_discovered": 4, 00:36:40.222 "num_base_bdevs_operational": 4, 00:36:40.222 "base_bdevs_list": [ 00:36:40.222 { 00:36:40.222 "name": "BaseBdev1", 00:36:40.222 "uuid": "61c1121c-1371-5980-ad14-a6d59c5383e6", 00:36:40.222 "is_configured": true, 00:36:40.222 "data_offset": 2048, 00:36:40.222 "data_size": 63488 00:36:40.222 }, 00:36:40.222 { 00:36:40.222 "name": "BaseBdev2", 00:36:40.222 "uuid": "5b62dbec-74b2-500f-98fe-b7763ceebbe6", 00:36:40.222 "is_configured": true, 00:36:40.222 "data_offset": 2048, 00:36:40.222 "data_size": 63488 00:36:40.222 }, 00:36:40.222 { 00:36:40.222 "name": "BaseBdev3", 00:36:40.222 "uuid": "f9f240a4-338f-5dba-97c3-8438b5ae4080", 00:36:40.222 "is_configured": true, 00:36:40.222 "data_offset": 2048, 00:36:40.222 "data_size": 63488 00:36:40.222 }, 00:36:40.222 { 00:36:40.222 "name": "BaseBdev4", 00:36:40.222 "uuid": "d15db0a2-5c1f-5be1-bc98-c0ff0a29c0a1", 00:36:40.222 "is_configured": true, 00:36:40.222 "data_offset": 2048, 00:36:40.222 "data_size": 63488 00:36:40.222 } 00:36:40.222 ] 00:36:40.222 }' 00:36:40.222 11:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:40.222 11:47:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:40.788 11:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:36:40.788 11:47:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:36:40.788 [2024-06-10 11:47:24.518039] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10db700 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:41.720 
11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:41.720 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:41.978 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:41.978 "name": "raid_bdev1", 00:36:41.978 "uuid": "70e09cef-3a95-4ab2-ae02-925b2aa6df83", 00:36:41.978 "strip_size_kb": 0, 00:36:41.978 "state": "online", 00:36:41.978 "raid_level": "raid1", 00:36:41.978 "superblock": true, 00:36:41.978 "num_base_bdevs": 4, 00:36:41.978 "num_base_bdevs_discovered": 4, 00:36:41.978 "num_base_bdevs_operational": 4, 00:36:41.978 "base_bdevs_list": [ 00:36:41.978 { 00:36:41.978 "name": "BaseBdev1", 00:36:41.978 "uuid": "61c1121c-1371-5980-ad14-a6d59c5383e6", 00:36:41.978 "is_configured": true, 00:36:41.978 "data_offset": 2048, 00:36:41.978 "data_size": 63488 00:36:41.978 }, 00:36:41.978 { 00:36:41.978 "name": "BaseBdev2", 00:36:41.978 "uuid": "5b62dbec-74b2-500f-98fe-b7763ceebbe6", 00:36:41.978 "is_configured": true, 00:36:41.978 "data_offset": 2048, 00:36:41.978 "data_size": 63488 00:36:41.978 }, 00:36:41.978 { 00:36:41.978 "name": "BaseBdev3", 00:36:41.978 "uuid": "f9f240a4-338f-5dba-97c3-8438b5ae4080", 00:36:41.978 "is_configured": true, 00:36:41.978 "data_offset": 2048, 00:36:41.978 "data_size": 63488 00:36:41.978 }, 00:36:41.978 { 00:36:41.978 "name": "BaseBdev4", 00:36:41.978 "uuid": "d15db0a2-5c1f-5be1-bc98-c0ff0a29c0a1", 00:36:41.978 "is_configured": true, 00:36:41.978 "data_offset": 2048, 00:36:41.978 "data_size": 63488 00:36:41.978 } 00:36:41.978 ] 00:36:41.978 }' 00:36:41.978 11:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:41.978 11:47:25 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:42.543 [2024-06-10 11:47:26.453775] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:42.543 [2024-06-10 11:47:26.453811] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:42.543 [2024-06-10 11:47:26.455900] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:42.543 [2024-06-10 11:47:26.455930] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:42.543 [2024-06-10 11:47:26.456014] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:42.543 [2024-06-10 11:47:26.456021] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10d9020 name raid_bdev1, state offline 00:36:42.543 0 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 214441 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 214441 ']' 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 214441 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:36:42.543 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 214441 00:36:42.801 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:36:42.801 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:36:42.801 11:47:26 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 214441' 00:36:42.801 killing process with pid 214441 00:36:42.801 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 214441 00:36:42.801 [2024-06-10 11:47:26.520648] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:36:42.801 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 214441 00:36:42.801 [2024-06-10 11:47:26.551368] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.dapxt0toVg 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:36:43.058 00:36:43.058 real 0m6.084s 00:36:43.058 user 0m9.380s 00:36:43.058 sys 0m1.094s 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:36:43.058 11:47:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:43.058 ************************************ 00:36:43.058 END TEST raid_read_error_test 00:36:43.058 ************************************ 00:36:43.058 11:47:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:36:43.058 11:47:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 
']' 00:36:43.058 11:47:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:36:43.058 11:47:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:36:43.058 ************************************ 00:36:43.058 START TEST raid_write_error_test 00:36:43.058 ************************************ 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 write 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:36:43.058 11:47:26 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:36:43.058 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.KtrJYJ1jow 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=215415 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 215415 /var/tmp/spdk-raid.sock 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 215415 ']' 00:36:43.059 11:47:26 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:36:43.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:36:43.059 11:47:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:43.059 [2024-06-10 11:47:26.920636] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:36:43.059 [2024-06-10 11:47:26.920689] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid215415 ] 00:36:43.316 [2024-06-10 11:47:27.007406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:43.316 [2024-06-10 11:47:27.092594] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:36:43.316 [2024-06-10 11:47:27.146462] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:43.316 [2024-06-10 11:47:27.146490] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:43.880 11:47:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:36:43.880 11:47:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:36:43.880 11:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:43.880 11:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:36:44.138 BaseBdev1_malloc 00:36:44.138 11:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:36:44.138 true 00:36:44.396 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:36:44.396 [2024-06-10 11:47:28.238653] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:36:44.396 [2024-06-10 11:47:28.238693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:44.396 [2024-06-10 11:47:28.238708] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d9b10 00:36:44.396 [2024-06-10 11:47:28.238717] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:44.396 [2024-06-10 11:47:28.239956] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:44.396 [2024-06-10 11:47:28.239982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:36:44.396 BaseBdev1 00:36:44.396 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:44.396 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:36:44.654 BaseBdev2_malloc 00:36:44.654 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:36:44.654 true 00:36:44.654 11:47:28 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:36:44.911 [2024-06-10 11:47:28.735545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:36:44.911 [2024-06-10 11:47:28.735584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:44.911 [2024-06-10 11:47:28.735597] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20de280 00:36:44.911 [2024-06-10 11:47:28.735605] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:44.911 [2024-06-10 11:47:28.736568] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:44.911 [2024-06-10 11:47:28.736591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:36:44.911 BaseBdev2 00:36:44.911 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:44.911 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:36:45.170 BaseBdev3_malloc 00:36:45.170 11:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:36:45.170 true 00:36:45.170 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:36:45.428 [2024-06-10 11:47:29.276910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:36:45.428 [2024-06-10 11:47:29.276949] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:45.428 [2024-06-10 11:47:29.276964] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e0ab0 00:36:45.428 [2024-06-10 11:47:29.276973] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:45.428 [2024-06-10 11:47:29.277941] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:45.428 [2024-06-10 11:47:29.277963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:36:45.428 BaseBdev3 00:36:45.428 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:36:45.428 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:36:45.687 BaseBdev4_malloc 00:36:45.687 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:36:45.945 true 00:36:45.945 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:36:45.945 [2024-06-10 11:47:29.793845] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:36:45.945 [2024-06-10 11:47:29.793886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:45.945 [2024-06-10 11:47:29.793899] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e1380 00:36:45.945 [2024-06-10 11:47:29.793907] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:45.945 [2024-06-10 11:47:29.794817] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:36:45.945 [2024-06-10 11:47:29.794838] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:36:45.945 BaseBdev4 00:36:45.945 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:36:46.204 [2024-06-10 11:47:29.978366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:46.204 [2024-06-10 11:47:29.979280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:46.204 [2024-06-10 11:47:29.979328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:46.204 [2024-06-10 11:47:29.979367] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:36:46.204 [2024-06-10 11:47:29.979531] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20d9020 00:36:46.204 [2024-06-10 11:47:29.979539] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:46.204 [2024-06-10 11:47:29.979674] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20db400 00:36:46.204 [2024-06-10 11:47:29.979787] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20d9020 00:36:46.204 [2024-06-10 11:47:29.979794] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20d9020 00:36:46.204 [2024-06-10 11:47:29.979864] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:46.204 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:46.204 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:46.204 11:47:29 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:46.204 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:46.204 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:46.204 11:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:46.204 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:46.204 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:46.204 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:46.204 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:46.204 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:46.204 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:46.462 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:46.462 "name": "raid_bdev1", 00:36:46.462 "uuid": "683bf49c-ec88-4137-b270-093f932842b5", 00:36:46.462 "strip_size_kb": 0, 00:36:46.462 "state": "online", 00:36:46.462 "raid_level": "raid1", 00:36:46.462 "superblock": true, 00:36:46.462 "num_base_bdevs": 4, 00:36:46.462 "num_base_bdevs_discovered": 4, 00:36:46.462 "num_base_bdevs_operational": 4, 00:36:46.462 "base_bdevs_list": [ 00:36:46.462 { 00:36:46.462 "name": "BaseBdev1", 00:36:46.462 "uuid": "d0a3b667-cb83-5989-a827-f87517e5a2b1", 00:36:46.462 "is_configured": true, 00:36:46.462 "data_offset": 2048, 00:36:46.462 "data_size": 63488 00:36:46.462 }, 00:36:46.462 { 00:36:46.462 "name": "BaseBdev2", 00:36:46.462 "uuid": "e7aee69e-3e26-5386-801a-2022075b0335", 00:36:46.462 "is_configured": true, 
00:36:46.462 "data_offset": 2048, 00:36:46.462 "data_size": 63488 00:36:46.462 }, 00:36:46.462 { 00:36:46.462 "name": "BaseBdev3", 00:36:46.462 "uuid": "e4be561f-108d-5ad1-8d6a-d57afd33d6bc", 00:36:46.462 "is_configured": true, 00:36:46.462 "data_offset": 2048, 00:36:46.462 "data_size": 63488 00:36:46.462 }, 00:36:46.462 { 00:36:46.462 "name": "BaseBdev4", 00:36:46.462 "uuid": "89b049be-52f9-5dc4-93c2-adf0a26147f2", 00:36:46.462 "is_configured": true, 00:36:46.462 "data_offset": 2048, 00:36:46.462 "data_size": 63488 00:36:46.462 } 00:36:46.462 ] 00:36:46.462 }' 00:36:46.462 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:46.462 11:47:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:47.027 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:36:47.027 11:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:36:47.027 [2024-06-10 11:47:30.756747] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20db700 00:36:47.960 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:36:47.961 [2024-06-10 11:47:31.851390] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:36:47.961 [2024-06-10 11:47:31.851441] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:36:47.961 [2024-06-10 11:47:31.851624] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20db700 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:47.961 11:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:48.219 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:48.219 "name": "raid_bdev1", 00:36:48.219 "uuid": "683bf49c-ec88-4137-b270-093f932842b5", 00:36:48.219 "strip_size_kb": 0, 00:36:48.219 "state": "online", 00:36:48.219 "raid_level": 
"raid1", 00:36:48.219 "superblock": true, 00:36:48.219 "num_base_bdevs": 4, 00:36:48.219 "num_base_bdevs_discovered": 3, 00:36:48.219 "num_base_bdevs_operational": 3, 00:36:48.219 "base_bdevs_list": [ 00:36:48.219 { 00:36:48.219 "name": null, 00:36:48.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:48.219 "is_configured": false, 00:36:48.219 "data_offset": 2048, 00:36:48.219 "data_size": 63488 00:36:48.219 }, 00:36:48.219 { 00:36:48.219 "name": "BaseBdev2", 00:36:48.219 "uuid": "e7aee69e-3e26-5386-801a-2022075b0335", 00:36:48.219 "is_configured": true, 00:36:48.219 "data_offset": 2048, 00:36:48.219 "data_size": 63488 00:36:48.219 }, 00:36:48.219 { 00:36:48.219 "name": "BaseBdev3", 00:36:48.219 "uuid": "e4be561f-108d-5ad1-8d6a-d57afd33d6bc", 00:36:48.219 "is_configured": true, 00:36:48.219 "data_offset": 2048, 00:36:48.219 "data_size": 63488 00:36:48.219 }, 00:36:48.219 { 00:36:48.219 "name": "BaseBdev4", 00:36:48.219 "uuid": "89b049be-52f9-5dc4-93c2-adf0a26147f2", 00:36:48.219 "is_configured": true, 00:36:48.219 "data_offset": 2048, 00:36:48.219 "data_size": 63488 00:36:48.219 } 00:36:48.219 ] 00:36:48.219 }' 00:36:48.219 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:48.219 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:48.783 [2024-06-10 11:47:32.680831] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:48.783 [2024-06-10 11:47:32.680876] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:48.783 [2024-06-10 11:47:32.682881] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:48.783 [2024-06-10 11:47:32.682914] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:36:48.783 [2024-06-10 11:47:32.682985] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:48.783 [2024-06-10 11:47:32.682994] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d9020 name raid_bdev1, state offline 00:36:48.783 0 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 215415 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 215415 ']' 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 215415 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:36:48.783 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 215415 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 215415' 00:36:49.042 killing process with pid 215415 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 215415 00:36:49.042 [2024-06-10 11:47:32.743796] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 215415 00:36:49.042 [2024-06-10 11:47:32.773273] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.KtrJYJ1jow 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:36:49.042 00:36:49.042 real 0m6.134s 00:36:49.042 user 0m9.460s 00:36:49.042 sys 0m1.107s 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:36:49.042 11:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:36:49.042 ************************************ 00:36:49.042 END TEST raid_write_error_test 00:36:49.042 ************************************ 00:36:49.300 11:47:33 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:36:49.300 11:47:33 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:36:49.300 11:47:33 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:36:49.300 11:47:33 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:36:49.300 11:47:33 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:36:49.300 11:47:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:36:49.300 ************************************ 00:36:49.300 START TEST raid_rebuild_test 00:36:49.300 ************************************ 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false false true 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local 
raid_level=raid1 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=216254 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 216254 /var/tmp/spdk-raid.sock 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 216254 ']' 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:36:49.300 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:36:49.301 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:36:49.301 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:36:49.301 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:36:49.301 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:49.301 [2024-06-10 11:47:33.132757] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:36:49.301 [2024-06-10 11:47:33.132820] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216254 ] 00:36:49.301 I/O size of 3145728 is greater than zero copy threshold (65536). 00:36:49.301 Zero copy mechanism will not be used. 00:36:49.301 [2024-06-10 11:47:33.222062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:49.558 [2024-06-10 11:47:33.302181] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:36:49.558 [2024-06-10 11:47:33.363385] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:49.558 [2024-06-10 11:47:33.363411] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:50.123 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:36:50.123 11:47:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:36:50.123 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:50.123 11:47:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:36:50.381 BaseBdev1_malloc 00:36:50.381 11:47:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:36:50.381 [2024-06-10 11:47:34.270873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:36:50.381 [2024-06-10 11:47:34.270917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:50.381 [2024-06-10 11:47:34.270933] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc61780 
00:36:50.381 [2024-06-10 11:47:34.270941] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:50.381 [2024-06-10 11:47:34.271995] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:50.381 [2024-06-10 11:47:34.272018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:36:50.381 BaseBdev1 00:36:50.381 11:47:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:50.381 11:47:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:36:50.639 BaseBdev2_malloc 00:36:50.639 11:47:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:36:50.897 [2024-06-10 11:47:34.643631] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:36:50.897 [2024-06-10 11:47:34.643668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:50.897 [2024-06-10 11:47:34.643680] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0ca50 00:36:50.897 [2024-06-10 11:47:34.643689] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:50.897 [2024-06-10 11:47:34.644594] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:50.897 [2024-06-10 11:47:34.644615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:36:50.897 BaseBdev2 00:36:50.897 11:47:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:36:50.897 spare_malloc 00:36:50.897 11:47:34 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:36:51.155 spare_delay 00:36:51.155 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:36:51.413 [2024-06-10 11:47:35.160473] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:36:51.413 [2024-06-10 11:47:35.160512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:51.413 [2024-06-10 11:47:35.160525] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0f970 00:36:51.413 [2024-06-10 11:47:35.160534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:51.413 [2024-06-10 11:47:35.161558] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:51.413 [2024-06-10 11:47:35.161578] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:36:51.413 spare 00:36:51.413 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:36:51.413 [2024-06-10 11:47:35.340943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:51.413 [2024-06-10 11:47:35.341708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:51.413 [2024-06-10 11:47:35.341761] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe11410 00:36:51.413 [2024-06-10 11:47:35.341769] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:36:51.413 [2024-06-10 11:47:35.341906] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe10d20 00:36:51.413 [2024-06-10 11:47:35.341996] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe11410 00:36:51.413 [2024-06-10 11:47:35.342002] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe11410 00:36:51.413 [2024-06-10 11:47:35.342069] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:51.673 "name": "raid_bdev1", 00:36:51.673 "uuid": 
"2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:36:51.673 "strip_size_kb": 0, 00:36:51.673 "state": "online", 00:36:51.673 "raid_level": "raid1", 00:36:51.673 "superblock": false, 00:36:51.673 "num_base_bdevs": 2, 00:36:51.673 "num_base_bdevs_discovered": 2, 00:36:51.673 "num_base_bdevs_operational": 2, 00:36:51.673 "base_bdevs_list": [ 00:36:51.673 { 00:36:51.673 "name": "BaseBdev1", 00:36:51.673 "uuid": "2909107f-60d7-557b-ba2f-dcc8a0e94ed4", 00:36:51.673 "is_configured": true, 00:36:51.673 "data_offset": 0, 00:36:51.673 "data_size": 65536 00:36:51.673 }, 00:36:51.673 { 00:36:51.673 "name": "BaseBdev2", 00:36:51.673 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:36:51.673 "is_configured": true, 00:36:51.673 "data_offset": 0, 00:36:51.673 "data_size": 65536 00:36:51.673 } 00:36:51.673 ] 00:36:51.673 }' 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:51.673 11:47:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:52.240 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:52.240 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:36:52.498 [2024-06-10 11:47:36.191301] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:52.498 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:36:52.499 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:36:52.757 [2024-06-10 11:47:36.536079] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe10d20 00:36:52.757 /dev/nbd0 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:52.757 1+0 records in 00:36:52.757 1+0 records out 00:36:52.757 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232236 s, 17.6 MB/s 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:36:52.757 11:47:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd 
if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:36:56.944 65536+0 records in 00:36:56.944 65536+0 records out 00:36:56.944 33554432 bytes (34 MB, 32 MiB) copied, 4.05904 s, 8.3 MB/s 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:36:56.944 [2024-06-10 11:47:40.851835] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:36:56.944 11:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:36:57.201 [2024-06-10 11:47:41.020301] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:57.201 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:57.459 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:57.459 "name": "raid_bdev1", 00:36:57.459 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:36:57.459 "strip_size_kb": 0, 00:36:57.459 "state": "online", 00:36:57.459 "raid_level": "raid1", 00:36:57.459 "superblock": false, 00:36:57.459 "num_base_bdevs": 2, 00:36:57.459 
"num_base_bdevs_discovered": 1, 00:36:57.459 "num_base_bdevs_operational": 1, 00:36:57.459 "base_bdevs_list": [ 00:36:57.459 { 00:36:57.459 "name": null, 00:36:57.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:57.459 "is_configured": false, 00:36:57.459 "data_offset": 0, 00:36:57.459 "data_size": 65536 00:36:57.459 }, 00:36:57.459 { 00:36:57.459 "name": "BaseBdev2", 00:36:57.459 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:36:57.459 "is_configured": true, 00:36:57.459 "data_offset": 0, 00:36:57.459 "data_size": 65536 00:36:57.459 } 00:36:57.459 ] 00:36:57.459 }' 00:36:57.459 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:57.459 11:47:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:58.026 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:36:58.026 [2024-06-10 11:47:41.878532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:58.026 [2024-06-10 11:47:41.882954] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe12370 00:36:58.026 [2024-06-10 11:47:41.884526] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:58.026 11:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:36:58.961 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:58.961 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:58.961 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:58.961 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:58.961 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local 
raid_bdev_info 00:36:59.220 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:59.220 11:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:59.220 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:59.220 "name": "raid_bdev1", 00:36:59.220 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:36:59.220 "strip_size_kb": 0, 00:36:59.220 "state": "online", 00:36:59.220 "raid_level": "raid1", 00:36:59.220 "superblock": false, 00:36:59.220 "num_base_bdevs": 2, 00:36:59.220 "num_base_bdevs_discovered": 2, 00:36:59.220 "num_base_bdevs_operational": 2, 00:36:59.220 "process": { 00:36:59.220 "type": "rebuild", 00:36:59.220 "target": "spare", 00:36:59.220 "progress": { 00:36:59.220 "blocks": 22528, 00:36:59.220 "percent": 34 00:36:59.220 } 00:36:59.220 }, 00:36:59.220 "base_bdevs_list": [ 00:36:59.220 { 00:36:59.220 "name": "spare", 00:36:59.220 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 00:36:59.220 "is_configured": true, 00:36:59.220 "data_offset": 0, 00:36:59.220 "data_size": 65536 00:36:59.220 }, 00:36:59.220 { 00:36:59.220 "name": "BaseBdev2", 00:36:59.220 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:36:59.220 "is_configured": true, 00:36:59.220 "data_offset": 0, 00:36:59.220 "data_size": 65536 00:36:59.220 } 00:36:59.220 ] 00:36:59.220 }' 00:36:59.220 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:59.220 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:59.220 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:36:59.478 [2024-06-10 11:47:43.327259] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:59.478 [2024-06-10 11:47:43.395433] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:36:59.478 [2024-06-10 11:47:43.395470] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:59.478 [2024-06-10 11:47:43.395480] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:59.478 [2024-06-10 11:47:43.395486] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:59.478 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:59.737 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:59.737 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:59.737 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:59.737 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:59.737 "name": "raid_bdev1", 00:36:59.737 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:36:59.737 "strip_size_kb": 0, 00:36:59.737 "state": "online", 00:36:59.737 "raid_level": "raid1", 00:36:59.737 "superblock": false, 00:36:59.737 "num_base_bdevs": 2, 00:36:59.737 "num_base_bdevs_discovered": 1, 00:36:59.737 "num_base_bdevs_operational": 1, 00:36:59.737 "base_bdevs_list": [ 00:36:59.737 { 00:36:59.737 "name": null, 00:36:59.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:59.737 "is_configured": false, 00:36:59.737 "data_offset": 0, 00:36:59.737 "data_size": 65536 00:36:59.737 }, 00:36:59.737 { 00:36:59.737 "name": "BaseBdev2", 00:36:59.737 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:36:59.737 "is_configured": true, 00:36:59.737 "data_offset": 0, 00:36:59.737 "data_size": 65536 00:36:59.737 } 00:36:59.737 ] 00:36:59.737 }' 00:36:59.737 11:47:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:59.737 11:47:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:00.306 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:00.306 "name": "raid_bdev1", 00:37:00.306 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:00.306 "strip_size_kb": 0, 00:37:00.306 "state": "online", 00:37:00.306 "raid_level": "raid1", 00:37:00.306 "superblock": false, 00:37:00.306 "num_base_bdevs": 2, 00:37:00.306 "num_base_bdevs_discovered": 1, 00:37:00.306 "num_base_bdevs_operational": 1, 00:37:00.306 "base_bdevs_list": [ 00:37:00.306 { 00:37:00.306 "name": null, 00:37:00.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:00.306 "is_configured": false, 00:37:00.306 "data_offset": 0, 00:37:00.306 "data_size": 65536 00:37:00.306 }, 00:37:00.306 { 00:37:00.306 "name": "BaseBdev2", 00:37:00.306 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:00.306 "is_configured": true, 00:37:00.306 "data_offset": 0, 00:37:00.306 "data_size": 65536 00:37:00.306 } 00:37:00.306 ] 00:37:00.306 }' 00:37:00.616 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:00.616 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:00.616 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:00.616 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:00.616 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:00.616 [2024-06-10 11:47:44.490560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:00.616 [2024-06-10 
11:47:44.495010] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe12370 00:37:00.616 [2024-06-10 11:47:44.496107] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:00.616 11:47:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:01.578 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:01.836 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:01.836 "name": "raid_bdev1", 00:37:01.836 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:01.836 "strip_size_kb": 0, 00:37:01.836 "state": "online", 00:37:01.836 "raid_level": "raid1", 00:37:01.836 "superblock": false, 00:37:01.836 "num_base_bdevs": 2, 00:37:01.836 "num_base_bdevs_discovered": 2, 00:37:01.836 "num_base_bdevs_operational": 2, 00:37:01.836 "process": { 00:37:01.836 "type": "rebuild", 00:37:01.836 "target": "spare", 00:37:01.836 "progress": { 00:37:01.836 "blocks": 22528, 00:37:01.836 "percent": 34 00:37:01.836 } 00:37:01.836 }, 00:37:01.836 "base_bdevs_list": [ 00:37:01.836 { 00:37:01.836 "name": "spare", 00:37:01.836 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 
00:37:01.836 "is_configured": true, 00:37:01.836 "data_offset": 0, 00:37:01.836 "data_size": 65536 00:37:01.836 }, 00:37:01.836 { 00:37:01.836 "name": "BaseBdev2", 00:37:01.836 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:01.836 "is_configured": true, 00:37:01.836 "data_offset": 0, 00:37:01.836 "data_size": 65536 00:37:01.836 } 00:37:01.836 ] 00:37:01.836 }' 00:37:01.836 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:01.836 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:01.836 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=595 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:02.095 
11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:02.095 "name": "raid_bdev1", 00:37:02.095 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:02.095 "strip_size_kb": 0, 00:37:02.095 "state": "online", 00:37:02.095 "raid_level": "raid1", 00:37:02.095 "superblock": false, 00:37:02.095 "num_base_bdevs": 2, 00:37:02.095 "num_base_bdevs_discovered": 2, 00:37:02.095 "num_base_bdevs_operational": 2, 00:37:02.095 "process": { 00:37:02.095 "type": "rebuild", 00:37:02.095 "target": "spare", 00:37:02.095 "progress": { 00:37:02.095 "blocks": 28672, 00:37:02.095 "percent": 43 00:37:02.095 } 00:37:02.095 }, 00:37:02.095 "base_bdevs_list": [ 00:37:02.095 { 00:37:02.095 "name": "spare", 00:37:02.095 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 00:37:02.095 "is_configured": true, 00:37:02.095 "data_offset": 0, 00:37:02.095 "data_size": 65536 00:37:02.095 }, 00:37:02.095 { 00:37:02.095 "name": "BaseBdev2", 00:37:02.095 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:02.095 "is_configured": true, 00:37:02.095 "data_offset": 0, 00:37:02.095 "data_size": 65536 00:37:02.095 } 00:37:02.095 ] 00:37:02.095 }' 00:37:02.095 11:47:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:02.095 11:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:02.095 11:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:02.096 11:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:02.096 11:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # 
sleep 1 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:03.471 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:03.471 "name": "raid_bdev1", 00:37:03.471 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:03.471 "strip_size_kb": 0, 00:37:03.471 "state": "online", 00:37:03.471 "raid_level": "raid1", 00:37:03.471 "superblock": false, 00:37:03.471 "num_base_bdevs": 2, 00:37:03.471 "num_base_bdevs_discovered": 2, 00:37:03.471 "num_base_bdevs_operational": 2, 00:37:03.471 "process": { 00:37:03.471 "type": "rebuild", 00:37:03.471 "target": "spare", 00:37:03.471 "progress": { 00:37:03.471 "blocks": 53248, 00:37:03.471 "percent": 81 00:37:03.471 } 00:37:03.471 }, 00:37:03.471 "base_bdevs_list": [ 00:37:03.471 { 00:37:03.471 "name": "spare", 00:37:03.471 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 00:37:03.471 "is_configured": true, 00:37:03.471 "data_offset": 0, 00:37:03.472 "data_size": 65536 00:37:03.472 }, 00:37:03.472 { 00:37:03.472 "name": "BaseBdev2", 00:37:03.472 "uuid": 
"47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:03.472 "is_configured": true, 00:37:03.472 "data_offset": 0, 00:37:03.472 "data_size": 65536 00:37:03.472 } 00:37:03.472 ] 00:37:03.472 }' 00:37:03.472 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:03.472 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:03.472 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:03.472 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:03.472 11:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:04.039 [2024-06-10 11:47:47.719500] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:37:04.039 [2024-06-10 11:47:47.719546] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:37:04.039 [2024-06-10 11:47:47.719575] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:04.605 "name": "raid_bdev1", 00:37:04.605 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:04.605 "strip_size_kb": 0, 00:37:04.605 "state": "online", 00:37:04.605 "raid_level": "raid1", 00:37:04.605 "superblock": false, 00:37:04.605 "num_base_bdevs": 2, 00:37:04.605 "num_base_bdevs_discovered": 2, 00:37:04.605 "num_base_bdevs_operational": 2, 00:37:04.605 "base_bdevs_list": [ 00:37:04.605 { 00:37:04.605 "name": "spare", 00:37:04.605 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 00:37:04.605 "is_configured": true, 00:37:04.605 "data_offset": 0, 00:37:04.605 "data_size": 65536 00:37:04.605 }, 00:37:04.605 { 00:37:04.605 "name": "BaseBdev2", 00:37:04.605 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:04.605 "is_configured": true, 00:37:04.605 "data_offset": 0, 00:37:04.605 "data_size": 65536 00:37:04.605 } 00:37:04.605 ] 00:37:04.605 }' 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:37:04.605 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:04.863 "name": "raid_bdev1", 00:37:04.863 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:04.863 "strip_size_kb": 0, 00:37:04.863 "state": "online", 00:37:04.863 "raid_level": "raid1", 00:37:04.863 "superblock": false, 00:37:04.863 "num_base_bdevs": 2, 00:37:04.863 "num_base_bdevs_discovered": 2, 00:37:04.863 "num_base_bdevs_operational": 2, 00:37:04.863 "base_bdevs_list": [ 00:37:04.863 { 00:37:04.863 "name": "spare", 00:37:04.863 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 00:37:04.863 "is_configured": true, 00:37:04.863 "data_offset": 0, 00:37:04.863 "data_size": 65536 00:37:04.863 }, 00:37:04.863 { 00:37:04.863 "name": "BaseBdev2", 00:37:04.863 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:04.863 "is_configured": true, 00:37:04.863 "data_offset": 0, 00:37:04.863 "data_size": 65536 00:37:04.863 } 00:37:04.863 ] 00:37:04.863 }' 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:04.863 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:05.121 11:47:48 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:05.121 "name": "raid_bdev1", 00:37:05.121 "uuid": "2fc4a67d-6123-4a71-8dd0-66deec4defa9", 00:37:05.121 "strip_size_kb": 0, 00:37:05.121 "state": "online", 00:37:05.121 "raid_level": "raid1", 00:37:05.121 "superblock": false, 00:37:05.121 "num_base_bdevs": 2, 00:37:05.121 "num_base_bdevs_discovered": 2, 00:37:05.121 "num_base_bdevs_operational": 2, 00:37:05.121 "base_bdevs_list": [ 00:37:05.121 { 00:37:05.121 "name": "spare", 00:37:05.121 "uuid": "ce870e91-62d4-5436-87a9-d118edfff4cc", 00:37:05.121 "is_configured": true, 00:37:05.121 "data_offset": 0, 00:37:05.121 "data_size": 65536 00:37:05.121 }, 00:37:05.121 { 00:37:05.121 "name": "BaseBdev2", 
00:37:05.121 "uuid": "47c46a3a-c799-5981-8a33-6795a57e288b", 00:37:05.121 "is_configured": true, 00:37:05.121 "data_offset": 0, 00:37:05.121 "data_size": 65536 00:37:05.121 } 00:37:05.121 ] 00:37:05.121 }' 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:05.121 11:47:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:37:05.686 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:37:05.686 [2024-06-10 11:47:49.617079] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:37:05.686 [2024-06-10 11:47:49.617108] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:37:05.686 [2024-06-10 11:47:49.617156] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:37:05.686 [2024-06-10 11:47:49.617196] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:37:05.686 [2024-06-10 11:47:49.617204] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe11410 name raid_bdev1, state offline 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 
00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:05.944 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:37:06.202 /dev/nbd0 00:37:06.202 11:47:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:06.202 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:06.202 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:37:06.202 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:06.203 11:47:50 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:06.203 1+0 records in 00:37:06.203 1+0 records out 00:37:06.203 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261036 s, 15.7 MB/s 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:06.203 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:37:06.460 /dev/nbd1 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:06.461 1+0 records in 00:37:06.461 1+0 records out 00:37:06.461 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275061 s, 14.9 MB/s 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:06.461 
11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:06.461 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:06.718 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:06.976 11:47:50 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 216254 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 216254 ']' 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 216254 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 216254 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 216254' 00:37:06.976 killing process with pid 216254 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 216254 00:37:06.976 Received shutdown signal, test time was about 60.000000 seconds 00:37:06.976 00:37:06.976 Latency(us) 00:37:06.976 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:06.976 
=================================================================================================================== 00:37:06.976 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:37:06.976 [2024-06-10 11:47:50.741905] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:37:06.976 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 216254 00:37:06.976 [2024-06-10 11:47:50.767816] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:37:07.235 11:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:37:07.235 00:37:07.235 real 0m17.894s 00:37:07.235 user 0m23.325s 00:37:07.235 sys 0m4.099s 00:37:07.235 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:37:07.235 11:47:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:37:07.235 ************************************ 00:37:07.235 END TEST raid_rebuild_test 00:37:07.235 ************************************ 00:37:07.235 11:47:51 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:37:07.235 11:47:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:37:07.235 11:47:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:37:07.235 11:47:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:37:07.235 ************************************ 00:37:07.235 START TEST raid_rebuild_test_sb 00:37:07.235 ************************************ 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local 
superblock=true 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:37:07.235 11:47:51 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=218918 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 218918 /var/tmp/spdk-raid.sock 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 218918 ']' 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:37:07.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:37:07.235 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:07.235 [2024-06-10 11:47:51.119203] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:37:07.235 [2024-06-10 11:47:51.119269] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid218918 ] 00:37:07.235 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:37:07.235 Zero copy mechanism will not be used. 00:37:07.496 [2024-06-10 11:47:51.207893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:07.496 [2024-06-10 11:47:51.294075] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:37:07.496 [2024-06-10 11:47:51.346463] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:07.496 [2024-06-10 11:47:51.346484] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:08.061 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:37:08.061 11:47:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:37:08.061 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:08.061 11:47:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:37:08.318 BaseBdev1_malloc 00:37:08.318 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:37:08.318 [2024-06-10 11:47:52.262268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:37:08.318 [2024-06-10 11:47:52.262309] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:08.318 [2024-06-10 11:47:52.262325] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e5780 00:37:08.318 [2024-06-10 11:47:52.262334] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:08.318 [2024-06-10 11:47:52.263468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:08.318 [2024-06-10 11:47:52.263491] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:37:08.576 BaseBdev1 00:37:08.576 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:08.576 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:37:08.576 BaseBdev2_malloc 00:37:08.576 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:37:08.834 [2024-06-10 11:47:52.627065] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:37:08.834 [2024-06-10 11:47:52.627101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:08.834 [2024-06-10 11:47:52.627117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2290a50 00:37:08.834 [2024-06-10 11:47:52.627125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:08.834 [2024-06-10 11:47:52.628033] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:08.834 [2024-06-10 11:47:52.628053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:37:08.834 BaseBdev2 00:37:08.834 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:37:09.093 spare_malloc 00:37:09.093 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:37:09.093 spare_delay 00:37:09.093 11:47:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:09.351 [2024-06-10 11:47:53.147882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:09.351 [2024-06-10 11:47:53.147917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:09.351 [2024-06-10 11:47:53.147932] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2293970 00:37:09.351 [2024-06-10 11:47:53.147943] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:09.351 [2024-06-10 11:47:53.148847] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:09.351 [2024-06-10 11:47:53.148874] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:09.351 spare 00:37:09.351 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:37:09.610 [2024-06-10 11:47:53.316331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:09.610 [2024-06-10 11:47:53.317125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:09.610 [2024-06-10 11:47:53.317237] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2295410 00:37:09.610 [2024-06-10 11:47:53.317245] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:37:09.610 [2024-06-10 11:47:53.317359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2294d20 00:37:09.610 [2024-06-10 11:47:53.317451] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2295410 00:37:09.610 [2024-06-10 11:47:53.317457] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x2295410 00:37:09.610 [2024-06-10 11:47:53.317514] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:09.610 "name": "raid_bdev1", 00:37:09.610 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:09.610 "strip_size_kb": 0, 00:37:09.610 "state": "online", 00:37:09.610 "raid_level": "raid1", 00:37:09.610 "superblock": true, 00:37:09.610 "num_base_bdevs": 2, 00:37:09.610 "num_base_bdevs_discovered": 2, 00:37:09.610 
"num_base_bdevs_operational": 2, 00:37:09.610 "base_bdevs_list": [ 00:37:09.610 { 00:37:09.610 "name": "BaseBdev1", 00:37:09.610 "uuid": "9d8880a9-3594-595b-9ae4-dd697efbaa29", 00:37:09.610 "is_configured": true, 00:37:09.610 "data_offset": 2048, 00:37:09.610 "data_size": 63488 00:37:09.610 }, 00:37:09.610 { 00:37:09.610 "name": "BaseBdev2", 00:37:09.610 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:09.610 "is_configured": true, 00:37:09.610 "data_offset": 2048, 00:37:09.610 "data_size": 63488 00:37:09.610 } 00:37:09.610 ] 00:37:09.610 }' 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:09.610 11:47:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:10.177 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:37:10.177 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:37:10.436 [2024-06-10 11:47:54.178695] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:10.436 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:37:10.695 [2024-06-10 11:47:54.539516] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2294d20 00:37:10.695 /dev/nbd0 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:10.695 1+0 records in 00:37:10.695 1+0 records out 00:37:10.695 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238421 s, 17.2 MB/s 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:37:10.695 11:47:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:37:14.879 63488+0 records in 00:37:14.879 63488+0 records out 00:37:14.879 32505856 bytes (33 MB, 
31 MiB) copied, 3.82689 s, 8.5 MB/s 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:37:14.879 [2024-06-10 11:47:58.621361] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:37:14.879 [2024-06-10 11:47:58.789833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:14.879 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:15.135 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:15.135 "name": "raid_bdev1", 00:37:15.135 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:15.135 "strip_size_kb": 0, 00:37:15.135 "state": "online", 00:37:15.135 "raid_level": "raid1", 00:37:15.135 "superblock": true, 00:37:15.135 "num_base_bdevs": 2, 00:37:15.135 "num_base_bdevs_discovered": 1, 00:37:15.135 
"num_base_bdevs_operational": 1, 00:37:15.135 "base_bdevs_list": [ 00:37:15.135 { 00:37:15.135 "name": null, 00:37:15.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:15.135 "is_configured": false, 00:37:15.135 "data_offset": 2048, 00:37:15.135 "data_size": 63488 00:37:15.136 }, 00:37:15.136 { 00:37:15.136 "name": "BaseBdev2", 00:37:15.136 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:15.136 "is_configured": true, 00:37:15.136 "data_offset": 2048, 00:37:15.136 "data_size": 63488 00:37:15.136 } 00:37:15.136 ] 00:37:15.136 }' 00:37:15.136 11:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:15.136 11:47:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:15.700 11:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:15.700 [2024-06-10 11:47:59.624008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:15.700 [2024-06-10 11:47:59.628514] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e4f60 00:37:15.700 [2024-06-10 11:47:59.630117] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:15.700 11:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:17.073 "name": "raid_bdev1", 00:37:17.073 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:17.073 "strip_size_kb": 0, 00:37:17.073 "state": "online", 00:37:17.073 "raid_level": "raid1", 00:37:17.073 "superblock": true, 00:37:17.073 "num_base_bdevs": 2, 00:37:17.073 "num_base_bdevs_discovered": 2, 00:37:17.073 "num_base_bdevs_operational": 2, 00:37:17.073 "process": { 00:37:17.073 "type": "rebuild", 00:37:17.073 "target": "spare", 00:37:17.073 "progress": { 00:37:17.073 "blocks": 22528, 00:37:17.073 "percent": 35 00:37:17.073 } 00:37:17.073 }, 00:37:17.073 "base_bdevs_list": [ 00:37:17.073 { 00:37:17.073 "name": "spare", 00:37:17.073 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:17.073 "is_configured": true, 00:37:17.073 "data_offset": 2048, 00:37:17.073 "data_size": 63488 00:37:17.073 }, 00:37:17.073 { 00:37:17.073 "name": "BaseBdev2", 00:37:17.073 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:17.073 "is_configured": true, 00:37:17.073 "data_offset": 2048, 00:37:17.073 "data_size": 63488 00:37:17.073 } 00:37:17.073 ] 00:37:17.073 }' 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:17.073 11:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:17.073 11:48:00 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:37:17.331 [2024-06-10 11:48:01.049052] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:17.331 [2024-06-10 11:48:01.141280] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:17.331 [2024-06-10 11:48:01.141319] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:17.331 [2024-06-10 11:48:01.141330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:17.331 [2024-06-10 11:48:01.141336] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:17.331 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:37:17.332 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:17.589 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:17.589 "name": "raid_bdev1", 00:37:17.589 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:17.589 "strip_size_kb": 0, 00:37:17.589 "state": "online", 00:37:17.589 "raid_level": "raid1", 00:37:17.589 "superblock": true, 00:37:17.589 "num_base_bdevs": 2, 00:37:17.589 "num_base_bdevs_discovered": 1, 00:37:17.589 "num_base_bdevs_operational": 1, 00:37:17.589 "base_bdevs_list": [ 00:37:17.589 { 00:37:17.589 "name": null, 00:37:17.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:17.589 "is_configured": false, 00:37:17.589 "data_offset": 2048, 00:37:17.589 "data_size": 63488 00:37:17.589 }, 00:37:17.589 { 00:37:17.589 "name": "BaseBdev2", 00:37:17.589 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:17.589 "is_configured": true, 00:37:17.589 "data_offset": 2048, 00:37:17.589 "data_size": 63488 00:37:17.589 } 00:37:17.589 ] 00:37:17.589 }' 00:37:17.589 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:17.589 11:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:18.154 11:48:01 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:18.154 "name": "raid_bdev1", 00:37:18.154 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:18.154 "strip_size_kb": 0, 00:37:18.154 "state": "online", 00:37:18.154 "raid_level": "raid1", 00:37:18.154 "superblock": true, 00:37:18.154 "num_base_bdevs": 2, 00:37:18.154 "num_base_bdevs_discovered": 1, 00:37:18.154 "num_base_bdevs_operational": 1, 00:37:18.154 "base_bdevs_list": [ 00:37:18.154 { 00:37:18.154 "name": null, 00:37:18.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:18.154 "is_configured": false, 00:37:18.154 "data_offset": 2048, 00:37:18.154 "data_size": 63488 00:37:18.154 }, 00:37:18.154 { 00:37:18.154 "name": "BaseBdev2", 00:37:18.154 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:18.154 "is_configured": true, 00:37:18.154 "data_offset": 2048, 00:37:18.154 "data_size": 63488 00:37:18.154 } 00:37:18.154 ] 00:37:18.154 }' 00:37:18.154 11:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:18.155 11:48:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:18.155 11:48:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:18.155 11:48:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:18.155 11:48:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:18.411 [2024-06-10 11:48:02.232503] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:18.411 [2024-06-10 11:48:02.237028] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e4f60 00:37:18.411 [2024-06-10 11:48:02.238097] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:18.411 11:48:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:19.342 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:19.600 "name": "raid_bdev1", 00:37:19.600 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:19.600 "strip_size_kb": 0, 00:37:19.600 "state": "online", 00:37:19.600 "raid_level": "raid1", 00:37:19.600 "superblock": true, 00:37:19.600 "num_base_bdevs": 2, 00:37:19.600 "num_base_bdevs_discovered": 2, 00:37:19.600 "num_base_bdevs_operational": 2, 00:37:19.600 "process": { 00:37:19.600 "type": "rebuild", 00:37:19.600 "target": "spare", 00:37:19.600 "progress": { 00:37:19.600 "blocks": 22528, 00:37:19.600 "percent": 35 00:37:19.600 } 00:37:19.600 }, 00:37:19.600 
"base_bdevs_list": [ 00:37:19.600 { 00:37:19.600 "name": "spare", 00:37:19.600 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:19.600 "is_configured": true, 00:37:19.600 "data_offset": 2048, 00:37:19.600 "data_size": 63488 00:37:19.600 }, 00:37:19.600 { 00:37:19.600 "name": "BaseBdev2", 00:37:19.600 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:19.600 "is_configured": true, 00:37:19.600 "data_offset": 2048, 00:37:19.600 "data_size": 63488 00:37:19.600 } 00:37:19.600 ] 00:37:19.600 }' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:37:19.600 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=613 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:19.600 11:48:03 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:19.600 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:19.858 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:19.858 "name": "raid_bdev1", 00:37:19.858 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:19.858 "strip_size_kb": 0, 00:37:19.858 "state": "online", 00:37:19.858 "raid_level": "raid1", 00:37:19.858 "superblock": true, 00:37:19.858 "num_base_bdevs": 2, 00:37:19.858 "num_base_bdevs_discovered": 2, 00:37:19.858 "num_base_bdevs_operational": 2, 00:37:19.858 "process": { 00:37:19.858 "type": "rebuild", 00:37:19.858 "target": "spare", 00:37:19.858 "progress": { 00:37:19.858 "blocks": 28672, 00:37:19.858 "percent": 45 00:37:19.858 } 00:37:19.858 }, 00:37:19.858 "base_bdevs_list": [ 00:37:19.858 { 00:37:19.858 "name": "spare", 00:37:19.858 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:19.858 "is_configured": true, 00:37:19.858 "data_offset": 2048, 00:37:19.858 "data_size": 63488 00:37:19.858 }, 00:37:19.858 { 00:37:19.858 "name": "BaseBdev2", 00:37:19.858 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:19.858 "is_configured": true, 00:37:19.858 "data_offset": 2048, 00:37:19.858 "data_size": 63488 00:37:19.858 } 00:37:19.858 ] 00:37:19.858 }' 00:37:19.858 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:37:19.858 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:19.858 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:20.116 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:20.116 11:48:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:21.048 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:21.049 11:48:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:21.307 11:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:21.307 "name": "raid_bdev1", 00:37:21.307 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:21.307 "strip_size_kb": 0, 00:37:21.307 "state": "online", 00:37:21.307 "raid_level": "raid1", 00:37:21.307 "superblock": true, 00:37:21.307 "num_base_bdevs": 2, 00:37:21.307 "num_base_bdevs_discovered": 2, 00:37:21.307 "num_base_bdevs_operational": 2, 00:37:21.307 "process": { 00:37:21.307 "type": "rebuild", 00:37:21.307 "target": "spare", 
00:37:21.307 "progress": { 00:37:21.307 "blocks": 55296, 00:37:21.307 "percent": 87 00:37:21.307 } 00:37:21.307 }, 00:37:21.307 "base_bdevs_list": [ 00:37:21.307 { 00:37:21.307 "name": "spare", 00:37:21.307 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:21.307 "is_configured": true, 00:37:21.307 "data_offset": 2048, 00:37:21.307 "data_size": 63488 00:37:21.307 }, 00:37:21.307 { 00:37:21.307 "name": "BaseBdev2", 00:37:21.307 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:21.307 "is_configured": true, 00:37:21.307 "data_offset": 2048, 00:37:21.307 "data_size": 63488 00:37:21.307 } 00:37:21.307 ] 00:37:21.307 }' 00:37:21.307 11:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:21.307 11:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:21.307 11:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:21.307 11:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:21.307 11:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:21.565 [2024-06-10 11:48:05.360858] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:37:21.565 [2024-06-10 11:48:05.360911] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:37:21.565 [2024-06-10 11:48:05.360979] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:22.498 "name": "raid_bdev1", 00:37:22.498 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:22.498 "strip_size_kb": 0, 00:37:22.498 "state": "online", 00:37:22.498 "raid_level": "raid1", 00:37:22.498 "superblock": true, 00:37:22.498 "num_base_bdevs": 2, 00:37:22.498 "num_base_bdevs_discovered": 2, 00:37:22.498 "num_base_bdevs_operational": 2, 00:37:22.498 "base_bdevs_list": [ 00:37:22.498 { 00:37:22.498 "name": "spare", 00:37:22.498 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:22.498 "is_configured": true, 00:37:22.498 "data_offset": 2048, 00:37:22.498 "data_size": 63488 00:37:22.498 }, 00:37:22.498 { 00:37:22.498 "name": "BaseBdev2", 00:37:22.498 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:22.498 "is_configured": true, 00:37:22.498 "data_offset": 2048, 00:37:22.498 "data_size": 63488 00:37:22.498 } 00:37:22.498 ] 00:37:22.498 }' 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:37:22.498 
11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:22.498 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:22.499 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:22.499 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:22.499 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:22.757 "name": "raid_bdev1", 00:37:22.757 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:22.757 "strip_size_kb": 0, 00:37:22.757 "state": "online", 00:37:22.757 "raid_level": "raid1", 00:37:22.757 "superblock": true, 00:37:22.757 "num_base_bdevs": 2, 00:37:22.757 "num_base_bdevs_discovered": 2, 00:37:22.757 "num_base_bdevs_operational": 2, 00:37:22.757 "base_bdevs_list": [ 00:37:22.757 { 00:37:22.757 "name": "spare", 00:37:22.757 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:22.757 "is_configured": true, 00:37:22.757 "data_offset": 2048, 00:37:22.757 "data_size": 63488 00:37:22.757 }, 00:37:22.757 { 00:37:22.757 "name": "BaseBdev2", 00:37:22.757 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:22.757 "is_configured": true, 00:37:22.757 "data_offset": 2048, 00:37:22.757 "data_size": 63488 00:37:22.757 } 00:37:22.757 ] 00:37:22.757 }' 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:22.757 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:23.016 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:23.016 "name": "raid_bdev1", 00:37:23.016 "uuid": 
"c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:23.016 "strip_size_kb": 0, 00:37:23.016 "state": "online", 00:37:23.016 "raid_level": "raid1", 00:37:23.016 "superblock": true, 00:37:23.016 "num_base_bdevs": 2, 00:37:23.016 "num_base_bdevs_discovered": 2, 00:37:23.016 "num_base_bdevs_operational": 2, 00:37:23.016 "base_bdevs_list": [ 00:37:23.016 { 00:37:23.016 "name": "spare", 00:37:23.016 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:23.016 "is_configured": true, 00:37:23.016 "data_offset": 2048, 00:37:23.016 "data_size": 63488 00:37:23.016 }, 00:37:23.016 { 00:37:23.016 "name": "BaseBdev2", 00:37:23.016 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:23.016 "is_configured": true, 00:37:23.016 "data_offset": 2048, 00:37:23.016 "data_size": 63488 00:37:23.016 } 00:37:23.016 ] 00:37:23.016 }' 00:37:23.016 11:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:23.016 11:48:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:23.582 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:37:23.582 [2024-06-10 11:48:07.411070] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:37:23.582 [2024-06-10 11:48:07.411093] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:37:23.582 [2024-06-10 11:48:07.411138] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:37:23.582 [2024-06-10 11:48:07.411175] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:37:23.582 [2024-06-10 11:48:07.411183] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2295410 name raid_bdev1, state offline 00:37:23.582 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:37:23.582 11:48:07 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:23.841 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:37:23.841 /dev/nbd0 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # 
local nbd_name=nbd0 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:24.099 1+0 records in 00:37:24.099 1+0 records out 00:37:24.099 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00012953 s, 31.6 MB/s 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:24.099 11:48:07 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:37:24.099 /dev/nbd1 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:24.099 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:24.100 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:24.100 1+0 records in 00:37:24.100 1+0 records out 00:37:24.100 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314062 s, 13.0 MB/s 00:37:24.100 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:24.100 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:37:24.100 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:37:24.100 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:24.100 11:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 
/proc/partitions 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:24.358 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:37:24.617 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:37:24.875 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:25.134 [2024-06-10 11:48:08.827403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:25.134 [2024-06-10 11:48:08.827446] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:25.134 [2024-06-10 11:48:08.827463] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2295690 00:37:25.134 [2024-06-10 11:48:08.827472] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:25.134 [2024-06-10 11:48:08.828655] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:25.134 [2024-06-10 11:48:08.828681] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:25.134 [2024-06-10 11:48:08.828743] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:37:25.134 [2024-06-10 11:48:08.828762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:25.134 [2024-06-10 11:48:08.828835] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:25.134 spare 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:25.134 11:48:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:25.134 11:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:25.134 [2024-06-10 11:48:08.929137] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20e46a0 00:37:25.134 [2024-06-10 11:48:08.929151] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:37:25.134 [2024-06-10 11:48:08.929302] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e72b0 00:37:25.134 [2024-06-10 11:48:08.929416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20e46a0 00:37:25.134 [2024-06-10 11:48:08.929424] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20e46a0 00:37:25.134 [2024-06-10 11:48:08.929501] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:25.134 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:25.134 "name": "raid_bdev1", 00:37:25.134 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:25.134 "strip_size_kb": 0, 00:37:25.134 "state": "online", 00:37:25.134 "raid_level": "raid1", 00:37:25.134 "superblock": true, 00:37:25.134 "num_base_bdevs": 2, 00:37:25.134 "num_base_bdevs_discovered": 2, 00:37:25.134 "num_base_bdevs_operational": 2, 00:37:25.134 "base_bdevs_list": [ 00:37:25.134 { 00:37:25.134 "name": "spare", 00:37:25.134 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:25.134 "is_configured": true, 00:37:25.134 "data_offset": 2048, 00:37:25.134 "data_size": 63488 00:37:25.134 }, 00:37:25.134 { 00:37:25.134 "name": "BaseBdev2", 00:37:25.134 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:25.134 
"is_configured": true, 00:37:25.134 "data_offset": 2048, 00:37:25.134 "data_size": 63488 00:37:25.134 } 00:37:25.134 ] 00:37:25.134 }' 00:37:25.134 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:25.134 11:48:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:25.701 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:25.959 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:25.959 "name": "raid_bdev1", 00:37:25.959 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:25.959 "strip_size_kb": 0, 00:37:25.959 "state": "online", 00:37:25.959 "raid_level": "raid1", 00:37:25.959 "superblock": true, 00:37:25.959 "num_base_bdevs": 2, 00:37:25.959 "num_base_bdevs_discovered": 2, 00:37:25.959 "num_base_bdevs_operational": 2, 00:37:25.959 "base_bdevs_list": [ 00:37:25.959 { 00:37:25.959 "name": "spare", 00:37:25.959 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:25.960 "is_configured": true, 00:37:25.960 "data_offset": 2048, 00:37:25.960 "data_size": 63488 00:37:25.960 }, 00:37:25.960 { 00:37:25.960 "name": "BaseBdev2", 00:37:25.960 "uuid": 
"99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:25.960 "is_configured": true, 00:37:25.960 "data_offset": 2048, 00:37:25.960 "data_size": 63488 00:37:25.960 } 00:37:25.960 ] 00:37:25.960 }' 00:37:25.960 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:25.960 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:25.960 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:25.960 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:25.960 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:25.960 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:37:26.229 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:37:26.229 11:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:37:26.229 [2024-06-10 11:48:10.110754] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:26.229 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:26.488 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:26.488 "name": "raid_bdev1", 00:37:26.488 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:26.488 "strip_size_kb": 0, 00:37:26.488 "state": "online", 00:37:26.488 "raid_level": "raid1", 00:37:26.488 "superblock": true, 00:37:26.488 "num_base_bdevs": 2, 00:37:26.488 "num_base_bdevs_discovered": 1, 00:37:26.488 "num_base_bdevs_operational": 1, 00:37:26.488 "base_bdevs_list": [ 00:37:26.488 { 00:37:26.488 "name": null, 00:37:26.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:26.488 "is_configured": false, 00:37:26.488 "data_offset": 2048, 00:37:26.488 "data_size": 63488 00:37:26.488 }, 00:37:26.488 { 00:37:26.488 "name": "BaseBdev2", 00:37:26.488 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:26.488 "is_configured": true, 00:37:26.488 "data_offset": 2048, 00:37:26.488 "data_size": 63488 00:37:26.488 } 00:37:26.488 ] 00:37:26.488 }' 00:37:26.488 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:26.488 11:48:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:27.055 11:48:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:27.055 [2024-06-10 11:48:10.892791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:27.055 [2024-06-10 11:48:10.892915] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:37:27.055 [2024-06-10 11:48:10.892927] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:37:27.055 [2024-06-10 11:48:10.892951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:27.055 [2024-06-10 11:48:10.897300] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e72b0 00:37:27.055 [2024-06-10 11:48:10.898954] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:27.055 11:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:28.042 11:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:37:28.311 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:28.311 "name": "raid_bdev1", 00:37:28.311 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:28.311 "strip_size_kb": 0, 00:37:28.311 "state": "online", 00:37:28.311 "raid_level": "raid1", 00:37:28.311 "superblock": true, 00:37:28.311 "num_base_bdevs": 2, 00:37:28.311 "num_base_bdevs_discovered": 2, 00:37:28.311 "num_base_bdevs_operational": 2, 00:37:28.311 "process": { 00:37:28.311 "type": "rebuild", 00:37:28.311 "target": "spare", 00:37:28.311 "progress": { 00:37:28.311 "blocks": 22528, 00:37:28.311 "percent": 35 00:37:28.311 } 00:37:28.311 }, 00:37:28.311 "base_bdevs_list": [ 00:37:28.311 { 00:37:28.311 "name": "spare", 00:37:28.311 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:28.311 "is_configured": true, 00:37:28.311 "data_offset": 2048, 00:37:28.311 "data_size": 63488 00:37:28.311 }, 00:37:28.311 { 00:37:28.311 "name": "BaseBdev2", 00:37:28.311 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:28.311 "is_configured": true, 00:37:28.311 "data_offset": 2048, 00:37:28.311 "data_size": 63488 00:37:28.311 } 00:37:28.311 ] 00:37:28.311 }' 00:37:28.311 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:28.311 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:28.311 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:28.312 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:28.312 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:37:28.577 [2024-06-10 11:48:12.325859] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:28.577 [2024-06-10 11:48:12.410056] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:28.577 [2024-06-10 11:48:12.410092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:28.577 [2024-06-10 11:48:12.410103] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:28.577 [2024-06-10 11:48:12.410109] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:28.577 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:28.836 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:37:28.836 "name": "raid_bdev1", 00:37:28.836 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:28.836 "strip_size_kb": 0, 00:37:28.836 "state": "online", 00:37:28.836 "raid_level": "raid1", 00:37:28.836 "superblock": true, 00:37:28.836 "num_base_bdevs": 2, 00:37:28.836 "num_base_bdevs_discovered": 1, 00:37:28.836 "num_base_bdevs_operational": 1, 00:37:28.836 "base_bdevs_list": [ 00:37:28.836 { 00:37:28.836 "name": null, 00:37:28.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:28.836 "is_configured": false, 00:37:28.836 "data_offset": 2048, 00:37:28.836 "data_size": 63488 00:37:28.836 }, 00:37:28.836 { 00:37:28.836 "name": "BaseBdev2", 00:37:28.836 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:28.836 "is_configured": true, 00:37:28.836 "data_offset": 2048, 00:37:28.836 "data_size": 63488 00:37:28.836 } 00:37:28.836 ] 00:37:28.836 }' 00:37:28.836 11:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:28.836 11:48:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:29.403 11:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:29.403 [2024-06-10 11:48:13.284788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:29.403 [2024-06-10 11:48:13.284831] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:29.403 [2024-06-10 11:48:13.284853] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e4a20 00:37:29.403 [2024-06-10 11:48:13.284861] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:29.404 [2024-06-10 11:48:13.285154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:29.404 [2024-06-10 11:48:13.285167] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: spare 00:37:29.404 [2024-06-10 11:48:13.285226] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:37:29.404 [2024-06-10 11:48:13.285235] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:37:29.404 [2024-06-10 11:48:13.285242] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:37:29.404 [2024-06-10 11:48:13.285255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:29.404 [2024-06-10 11:48:13.289608] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e72b0 00:37:29.404 spare 00:37:29.404 [2024-06-10 11:48:13.290682] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:29.404 11:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:30.782 "name": "raid_bdev1", 
00:37:30.782 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:30.782 "strip_size_kb": 0, 00:37:30.782 "state": "online", 00:37:30.782 "raid_level": "raid1", 00:37:30.782 "superblock": true, 00:37:30.782 "num_base_bdevs": 2, 00:37:30.782 "num_base_bdevs_discovered": 2, 00:37:30.782 "num_base_bdevs_operational": 2, 00:37:30.782 "process": { 00:37:30.782 "type": "rebuild", 00:37:30.782 "target": "spare", 00:37:30.782 "progress": { 00:37:30.782 "blocks": 22528, 00:37:30.782 "percent": 35 00:37:30.782 } 00:37:30.782 }, 00:37:30.782 "base_bdevs_list": [ 00:37:30.782 { 00:37:30.782 "name": "spare", 00:37:30.782 "uuid": "ef7be36a-ba16-5707-82e5-b65eed834224", 00:37:30.782 "is_configured": true, 00:37:30.782 "data_offset": 2048, 00:37:30.782 "data_size": 63488 00:37:30.782 }, 00:37:30.782 { 00:37:30.782 "name": "BaseBdev2", 00:37:30.782 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:30.782 "is_configured": true, 00:37:30.782 "data_offset": 2048, 00:37:30.782 "data_size": 63488 00:37:30.782 } 00:37:30.782 ] 00:37:30.782 }' 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:30.782 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:37:30.782 [2024-06-10 11:48:14.717222] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:31.041 [2024-06-10 11:48:14.801863] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:31.041 [2024-06-10 11:48:14.801902] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:31.041 [2024-06-10 11:48:14.801913] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:31.041 [2024-06-10 11:48:14.801919] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:31.041 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:31.298 11:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:31.298 "name": "raid_bdev1", 00:37:31.298 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:31.298 "strip_size_kb": 0, 00:37:31.298 
"state": "online", 00:37:31.298 "raid_level": "raid1", 00:37:31.298 "superblock": true, 00:37:31.298 "num_base_bdevs": 2, 00:37:31.298 "num_base_bdevs_discovered": 1, 00:37:31.299 "num_base_bdevs_operational": 1, 00:37:31.299 "base_bdevs_list": [ 00:37:31.299 { 00:37:31.299 "name": null, 00:37:31.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:31.299 "is_configured": false, 00:37:31.299 "data_offset": 2048, 00:37:31.299 "data_size": 63488 00:37:31.299 }, 00:37:31.299 { 00:37:31.299 "name": "BaseBdev2", 00:37:31.299 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:31.299 "is_configured": true, 00:37:31.299 "data_offset": 2048, 00:37:31.299 "data_size": 63488 00:37:31.299 } 00:37:31.299 ] 00:37:31.299 }' 00:37:31.299 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:31.299 11:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:31.556 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:31.814 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:31.814 "name": "raid_bdev1", 00:37:31.814 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 
00:37:31.814 "strip_size_kb": 0, 00:37:31.814 "state": "online", 00:37:31.814 "raid_level": "raid1", 00:37:31.814 "superblock": true, 00:37:31.814 "num_base_bdevs": 2, 00:37:31.814 "num_base_bdevs_discovered": 1, 00:37:31.814 "num_base_bdevs_operational": 1, 00:37:31.814 "base_bdevs_list": [ 00:37:31.814 { 00:37:31.814 "name": null, 00:37:31.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:31.814 "is_configured": false, 00:37:31.814 "data_offset": 2048, 00:37:31.814 "data_size": 63488 00:37:31.814 }, 00:37:31.814 { 00:37:31.814 "name": "BaseBdev2", 00:37:31.814 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:31.814 "is_configured": true, 00:37:31.814 "data_offset": 2048, 00:37:31.814 "data_size": 63488 00:37:31.814 } 00:37:31.814 ] 00:37:31.814 }' 00:37:31.814 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:31.814 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:31.814 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:31.814 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:31.814 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:37:32.072 11:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:37:32.328 [2024-06-10 11:48:16.057831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:37:32.328 [2024-06-10 11:48:16.057870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:32.328 [2024-06-10 11:48:16.057903] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device 
created at: 0x0x2293ba0 00:37:32.328 [2024-06-10 11:48:16.057911] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:32.328 [2024-06-10 11:48:16.058174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:32.328 [2024-06-10 11:48:16.058186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:37:32.328 [2024-06-10 11:48:16.058236] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:37:32.328 [2024-06-10 11:48:16.058245] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:37:32.328 [2024-06-10 11:48:16.058253] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:37:32.328 BaseBdev1 00:37:32.328 11:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:33.260 11:48:17 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:33.260 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:33.518 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:33.518 "name": "raid_bdev1", 00:37:33.518 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:33.518 "strip_size_kb": 0, 00:37:33.518 "state": "online", 00:37:33.518 "raid_level": "raid1", 00:37:33.518 "superblock": true, 00:37:33.518 "num_base_bdevs": 2, 00:37:33.518 "num_base_bdevs_discovered": 1, 00:37:33.518 "num_base_bdevs_operational": 1, 00:37:33.518 "base_bdevs_list": [ 00:37:33.518 { 00:37:33.518 "name": null, 00:37:33.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:33.518 "is_configured": false, 00:37:33.518 "data_offset": 2048, 00:37:33.518 "data_size": 63488 00:37:33.518 }, 00:37:33.518 { 00:37:33.518 "name": "BaseBdev2", 00:37:33.518 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:33.518 "is_configured": true, 00:37:33.518 "data_offset": 2048, 00:37:33.518 "data_size": 63488 00:37:33.518 } 00:37:33.518 ] 00:37:33.518 }' 00:37:33.518 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:33.518 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:34.085 "name": "raid_bdev1", 00:37:34.085 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:34.085 "strip_size_kb": 0, 00:37:34.085 "state": "online", 00:37:34.085 "raid_level": "raid1", 00:37:34.085 "superblock": true, 00:37:34.085 "num_base_bdevs": 2, 00:37:34.085 "num_base_bdevs_discovered": 1, 00:37:34.085 "num_base_bdevs_operational": 1, 00:37:34.085 "base_bdevs_list": [ 00:37:34.085 { 00:37:34.085 "name": null, 00:37:34.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:34.085 "is_configured": false, 00:37:34.085 "data_offset": 2048, 00:37:34.085 "data_size": 63488 00:37:34.085 }, 00:37:34.085 { 00:37:34.085 "name": "BaseBdev2", 00:37:34.085 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:34.085 "is_configured": true, 00:37:34.085 "data_offset": 2048, 00:37:34.085 "data_size": 63488 00:37:34.085 } 00:37:34.085 ] 00:37:34.085 }' 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:37:34.085 11:48:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:37:34.342 [2024-06-10 11:48:18.147251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 
00:37:34.342 [2024-06-10 11:48:18.147363] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:37:34.342 [2024-06-10 11:48:18.147375] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:37:34.342 request: 00:37:34.342 { 00:37:34.342 "raid_bdev": "raid_bdev1", 00:37:34.342 "base_bdev": "BaseBdev1", 00:37:34.342 "method": "bdev_raid_add_base_bdev", 00:37:34.342 "req_id": 1 00:37:34.342 } 00:37:34.342 Got JSON-RPC error response 00:37:34.342 response: 00:37:34.342 { 00:37:34.342 "code": -22, 00:37:34.342 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:37:34.342 } 00:37:34.342 11:48:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:37:34.342 11:48:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:37:34.342 11:48:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:37:34.342 11:48:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:37:34.342 11:48:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:35.274 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:35.533 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:35.533 "name": "raid_bdev1", 00:37:35.533 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:35.533 "strip_size_kb": 0, 00:37:35.533 "state": "online", 00:37:35.533 "raid_level": "raid1", 00:37:35.533 "superblock": true, 00:37:35.533 "num_base_bdevs": 2, 00:37:35.533 "num_base_bdevs_discovered": 1, 00:37:35.533 "num_base_bdevs_operational": 1, 00:37:35.533 "base_bdevs_list": [ 00:37:35.533 { 00:37:35.533 "name": null, 00:37:35.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:35.533 "is_configured": false, 00:37:35.533 "data_offset": 2048, 00:37:35.533 "data_size": 63488 00:37:35.533 }, 00:37:35.533 { 00:37:35.533 "name": "BaseBdev2", 00:37:35.533 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:35.533 "is_configured": true, 00:37:35.533 "data_offset": 2048, 00:37:35.533 "data_size": 63488 00:37:35.533 } 00:37:35.533 ] 00:37:35.533 }' 00:37:35.533 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:35.533 11:48:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:36.098 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:36.098 11:48:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:36.098 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:36.098 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:36.098 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:36.098 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:36.098 11:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:36.357 "name": "raid_bdev1", 00:37:36.357 "uuid": "c9f857db-80d7-4879-bf6d-5916e608f35a", 00:37:36.357 "strip_size_kb": 0, 00:37:36.357 "state": "online", 00:37:36.357 "raid_level": "raid1", 00:37:36.357 "superblock": true, 00:37:36.357 "num_base_bdevs": 2, 00:37:36.357 "num_base_bdevs_discovered": 1, 00:37:36.357 "num_base_bdevs_operational": 1, 00:37:36.357 "base_bdevs_list": [ 00:37:36.357 { 00:37:36.357 "name": null, 00:37:36.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:36.357 "is_configured": false, 00:37:36.357 "data_offset": 2048, 00:37:36.357 "data_size": 63488 00:37:36.357 }, 00:37:36.357 { 00:37:36.357 "name": "BaseBdev2", 00:37:36.357 "uuid": "99f0c359-1348-5101-acc1-9abbaf5a6325", 00:37:36.357 "is_configured": true, 00:37:36.357 "data_offset": 2048, 00:37:36.357 "data_size": 63488 00:37:36.357 } 00:37:36.357 ] 00:37:36.357 }' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 
-- # jq -r '.process.target // "none"' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 218918 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 218918 ']' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 218918 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 218918 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 218918' 00:37:36.357 killing process with pid 218918 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 218918 00:37:36.357 Received shutdown signal, test time was about 60.000000 seconds 00:37:36.357 00:37:36.357 Latency(us) 00:37:36.357 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:36.357 =================================================================================================================== 00:37:36.357 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:37:36.357 [2024-06-10 11:48:20.156340] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:37:36.357 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 218918 00:37:36.357 [2024-06-10 11:48:20.156416] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:37:36.357 [2024-06-10 11:48:20.156450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:37:36.357 [2024-06-10 11:48:20.156459] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e46a0 name raid_bdev1, state offline 00:37:36.357 [2024-06-10 11:48:20.184031] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:37:36.616 00:37:36.616 real 0m29.330s 00:37:36.616 user 0m41.404s 00:37:36.616 sys 0m5.411s 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:37:36.616 ************************************ 00:37:36.616 END TEST raid_rebuild_test_sb 00:37:36.616 ************************************ 00:37:36.616 11:48:20 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:37:36.616 11:48:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:37:36.616 11:48:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:37:36.616 11:48:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:37:36.616 ************************************ 00:37:36.616 START TEST raid_rebuild_test_io 00:37:36.616 ************************************ 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false true true 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:37:36.616 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:37:36.616 11:48:20 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=223607 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 223607 /var/tmp/spdk-raid.sock 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 223607 ']' 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:37:36.617 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:37:36.617 11:48:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:36.617 [2024-06-10 11:48:20.524315] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:37:36.617 [2024-06-10 11:48:20.524370] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid223607 ] 00:37:36.617 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:36.617 Zero copy mechanism will not be used. 
00:37:36.877 [2024-06-10 11:48:20.610168] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:36.877 [2024-06-10 11:48:20.690432] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:37:36.877 [2024-06-10 11:48:20.749718] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:36.877 [2024-06-10 11:48:20.749746] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:37.444 11:48:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:37:37.444 11:48:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:37:37.444 11:48:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:37.444 11:48:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:37:37.702 BaseBdev1_malloc 00:37:37.702 11:48:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:37:37.960 [2024-06-10 11:48:21.655702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:37:37.960 [2024-06-10 11:48:21.655743] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:37.960 [2024-06-10 11:48:21.655776] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8db780 00:37:37.960 [2024-06-10 11:48:21.655785] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:37.960 [2024-06-10 11:48:21.657067] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:37.960 [2024-06-10 11:48:21.657089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:37:37.960 BaseBdev1 
00:37:37.960 11:48:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:37.960 11:48:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:37:37.960 BaseBdev2_malloc 00:37:37.960 11:48:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:37:38.218 [2024-06-10 11:48:22.001638] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:37:38.218 [2024-06-10 11:48:22.001676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:38.218 [2024-06-10 11:48:22.001690] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa86a50 00:37:38.218 [2024-06-10 11:48:22.001698] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:38.218 [2024-06-10 11:48:22.002846] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:38.218 [2024-06-10 11:48:22.002879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:37:38.218 BaseBdev2 00:37:38.218 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:37:38.475 spare_malloc 00:37:38.475 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:37:38.475 spare_delay 00:37:38.475 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:38.732 [2024-06-10 11:48:22.507860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:38.732 [2024-06-10 11:48:22.507898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:38.732 [2024-06-10 11:48:22.507928] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa89970 00:37:38.732 [2024-06-10 11:48:22.507948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:38.732 [2024-06-10 11:48:22.508911] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:38.732 [2024-06-10 11:48:22.508930] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:38.732 spare 00:37:38.732 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:37:38.732 [2024-06-10 11:48:22.676312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:38.732 [2024-06-10 11:48:22.677291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:38.732 [2024-06-10 11:48:22.677348] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa8b410 00:37:38.732 [2024-06-10 11:48:22.677356] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:37:38.732 [2024-06-10 11:48:22.677514] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa8ad20 00:37:38.732 [2024-06-10 11:48:22.677612] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa8b410 00:37:38.732 [2024-06-10 11:48:22.677619] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xa8b410 00:37:38.732 [2024-06-10 11:48:22.677699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:38.989 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:38.989 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:38.989 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:38.990 "name": "raid_bdev1", 00:37:38.990 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:38.990 "strip_size_kb": 0, 00:37:38.990 "state": "online", 00:37:38.990 "raid_level": "raid1", 00:37:38.990 "superblock": false, 00:37:38.990 "num_base_bdevs": 2, 00:37:38.990 "num_base_bdevs_discovered": 2, 00:37:38.990 "num_base_bdevs_operational": 
2, 00:37:38.990 "base_bdevs_list": [ 00:37:38.990 { 00:37:38.990 "name": "BaseBdev1", 00:37:38.990 "uuid": "c396853b-28bc-5c0e-8a98-d05b557007fc", 00:37:38.990 "is_configured": true, 00:37:38.990 "data_offset": 0, 00:37:38.990 "data_size": 65536 00:37:38.990 }, 00:37:38.990 { 00:37:38.990 "name": "BaseBdev2", 00:37:38.990 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:38.990 "is_configured": true, 00:37:38.990 "data_offset": 0, 00:37:38.990 "data_size": 65536 00:37:38.990 } 00:37:38.990 ] 00:37:38.990 }' 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:38.990 11:48:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:39.553 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:37:39.553 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:37:39.811 [2024-06-10 11:48:23.506562] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:37:39.811 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:37:39.811 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:39.811 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:37:39.811 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:37:39.811 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:37:39.811 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:37:39.811 11:48:23 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:37:40.070 [2024-06-10 11:48:23.773218] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa889e0 00:37:40.070 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:40.070 Zero copy mechanism will not be used. 00:37:40.070 Running I/O for 60 seconds... 00:37:40.070 [2024-06-10 11:48:23.850708] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:37:40.070 [2024-06-10 11:48:23.856517] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa889e0 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:37:40.070 11:48:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:40.328 11:48:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:40.328 "name": "raid_bdev1", 00:37:40.328 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:40.328 "strip_size_kb": 0, 00:37:40.328 "state": "online", 00:37:40.328 "raid_level": "raid1", 00:37:40.328 "superblock": false, 00:37:40.328 "num_base_bdevs": 2, 00:37:40.328 "num_base_bdevs_discovered": 1, 00:37:40.328 "num_base_bdevs_operational": 1, 00:37:40.328 "base_bdevs_list": [ 00:37:40.328 { 00:37:40.328 "name": null, 00:37:40.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:40.328 "is_configured": false, 00:37:40.328 "data_offset": 0, 00:37:40.328 "data_size": 65536 00:37:40.328 }, 00:37:40.328 { 00:37:40.328 "name": "BaseBdev2", 00:37:40.328 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:40.328 "is_configured": true, 00:37:40.328 "data_offset": 0, 00:37:40.328 "data_size": 65536 00:37:40.328 } 00:37:40.328 ] 00:37:40.328 }' 00:37:40.328 11:48:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:40.328 11:48:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:40.895 11:48:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:40.895 [2024-06-10 11:48:24.734159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:40.895 11:48:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:37:40.895 [2024-06-10 11:48:24.794130] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa0e860 00:37:40.895 [2024-06-10 11:48:24.795975] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 
00:37:41.153 [2024-06-10 11:48:24.904408] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:41.153 [2024-06-10 11:48:24.904796] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:41.412 [2024-06-10 11:48:25.106369] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:41.412 [2024-06-10 11:48:25.106469] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:41.670 [2024-06-10 11:48:25.433749] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:41.929 [2024-06-10 11:48:25.666950] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:41.929 [2024-06-10 11:48:25.667176] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:41.929 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:37:42.188 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:42.188 "name": "raid_bdev1", 00:37:42.188 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:42.188 "strip_size_kb": 0, 00:37:42.188 "state": "online", 00:37:42.188 "raid_level": "raid1", 00:37:42.188 "superblock": false, 00:37:42.188 "num_base_bdevs": 2, 00:37:42.188 "num_base_bdevs_discovered": 2, 00:37:42.188 "num_base_bdevs_operational": 2, 00:37:42.188 "process": { 00:37:42.188 "type": "rebuild", 00:37:42.188 "target": "spare", 00:37:42.188 "progress": { 00:37:42.188 "blocks": 12288, 00:37:42.188 "percent": 18 00:37:42.188 } 00:37:42.188 }, 00:37:42.188 "base_bdevs_list": [ 00:37:42.188 { 00:37:42.188 "name": "spare", 00:37:42.188 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:42.188 "is_configured": true, 00:37:42.188 "data_offset": 0, 00:37:42.188 "data_size": 65536 00:37:42.188 }, 00:37:42.188 { 00:37:42.188 "name": "BaseBdev2", 00:37:42.188 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:42.188 "is_configured": true, 00:37:42.188 "data_offset": 0, 00:37:42.188 "data_size": 65536 00:37:42.188 } 00:37:42.188 ] 00:37:42.188 }' 00:37:42.188 11:48:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:42.188 [2024-06-10 11:48:25.995843] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:42.188 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:42.188 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:42.188 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:42.188 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:37:42.446 [2024-06-10 11:48:26.208426] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:42.446 [2024-06-10 11:48:26.320539] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:42.446 [2024-06-10 11:48:26.322038] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:42.446 [2024-06-10 11:48:26.322058] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:42.446 [2024-06-10 11:48:26.322065] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:42.446 [2024-06-10 11:48:26.334520] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa889e0 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:37:42.446 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:42.704 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:42.704 "name": "raid_bdev1", 00:37:42.704 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:42.704 "strip_size_kb": 0, 00:37:42.704 "state": "online", 00:37:42.704 "raid_level": "raid1", 00:37:42.704 "superblock": false, 00:37:42.704 "num_base_bdevs": 2, 00:37:42.704 "num_base_bdevs_discovered": 1, 00:37:42.704 "num_base_bdevs_operational": 1, 00:37:42.704 "base_bdevs_list": [ 00:37:42.704 { 00:37:42.704 "name": null, 00:37:42.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:42.704 "is_configured": false, 00:37:42.704 "data_offset": 0, 00:37:42.704 "data_size": 65536 00:37:42.704 }, 00:37:42.704 { 00:37:42.704 "name": "BaseBdev2", 00:37:42.704 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:42.704 "is_configured": true, 00:37:42.704 "data_offset": 0, 00:37:42.704 "data_size": 65536 00:37:42.704 } 00:37:42.704 ] 00:37:42.704 }' 00:37:42.704 11:48:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:42.704 11:48:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:43.269 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:43.527 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:43.527 "name": "raid_bdev1", 00:37:43.527 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:43.527 "strip_size_kb": 0, 00:37:43.527 "state": "online", 00:37:43.527 "raid_level": "raid1", 00:37:43.527 "superblock": false, 00:37:43.527 "num_base_bdevs": 2, 00:37:43.527 "num_base_bdevs_discovered": 1, 00:37:43.527 "num_base_bdevs_operational": 1, 00:37:43.527 "base_bdevs_list": [ 00:37:43.527 { 00:37:43.527 "name": null, 00:37:43.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:43.527 "is_configured": false, 00:37:43.527 "data_offset": 0, 00:37:43.527 "data_size": 65536 00:37:43.527 }, 00:37:43.527 { 00:37:43.527 "name": "BaseBdev2", 00:37:43.527 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:43.527 "is_configured": true, 00:37:43.527 "data_offset": 0, 00:37:43.527 "data_size": 65536 00:37:43.527 } 00:37:43.527 ] 00:37:43.527 }' 00:37:43.527 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:43.527 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:43.527 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:43.527 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:43.527 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:43.784 [2024-06-10 11:48:27.508047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
spare is claimed 00:37:43.785 [2024-06-10 11:48:27.550674] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa8b3e0 00:37:43.785 [2024-06-10 11:48:27.551792] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:43.785 11:48:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:37:43.785 [2024-06-10 11:48:27.661415] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:43.785 [2024-06-10 11:48:27.661672] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:44.054 [2024-06-10 11:48:27.778235] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:44.054 [2024-06-10 11:48:27.778350] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:44.311 [2024-06-10 11:48:28.101540] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:44.311 [2024-06-10 11:48:28.222075] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:44.311 [2024-06-10 11:48:28.222227] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:44.877 [2024-06-10 11:48:28.686046] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:44.877 "name": "raid_bdev1", 00:37:44.877 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:44.877 "strip_size_kb": 0, 00:37:44.877 "state": "online", 00:37:44.877 "raid_level": "raid1", 00:37:44.877 "superblock": false, 00:37:44.877 "num_base_bdevs": 2, 00:37:44.877 "num_base_bdevs_discovered": 2, 00:37:44.877 "num_base_bdevs_operational": 2, 00:37:44.877 "process": { 00:37:44.877 "type": "rebuild", 00:37:44.877 "target": "spare", 00:37:44.877 "progress": { 00:37:44.877 "blocks": 16384, 00:37:44.877 "percent": 25 00:37:44.877 } 00:37:44.877 }, 00:37:44.877 "base_bdevs_list": [ 00:37:44.877 { 00:37:44.877 "name": "spare", 00:37:44.877 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:44.877 "is_configured": true, 00:37:44.877 "data_offset": 0, 00:37:44.877 "data_size": 65536 00:37:44.877 }, 00:37:44.877 { 00:37:44.877 "name": "BaseBdev2", 00:37:44.877 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:44.877 "is_configured": true, 00:37:44.877 "data_offset": 0, 00:37:44.877 "data_size": 65536 00:37:44.877 } 00:37:44.877 ] 00:37:44.877 }' 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:44.877 11:48:28 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=638 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:45.135 11:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:45.135 [2024-06-10 11:48:28.922903] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:45.135 11:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:45.135 "name": "raid_bdev1", 
00:37:45.135 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:45.135 "strip_size_kb": 0, 00:37:45.135 "state": "online", 00:37:45.135 "raid_level": "raid1", 00:37:45.135 "superblock": false, 00:37:45.135 "num_base_bdevs": 2, 00:37:45.135 "num_base_bdevs_discovered": 2, 00:37:45.135 "num_base_bdevs_operational": 2, 00:37:45.135 "process": { 00:37:45.135 "type": "rebuild", 00:37:45.135 "target": "spare", 00:37:45.135 "progress": { 00:37:45.135 "blocks": 20480, 00:37:45.135 "percent": 31 00:37:45.135 } 00:37:45.135 }, 00:37:45.135 "base_bdevs_list": [ 00:37:45.135 { 00:37:45.135 "name": "spare", 00:37:45.135 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:45.135 "is_configured": true, 00:37:45.135 "data_offset": 0, 00:37:45.135 "data_size": 65536 00:37:45.135 }, 00:37:45.135 { 00:37:45.135 "name": "BaseBdev2", 00:37:45.135 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:45.135 "is_configured": true, 00:37:45.135 "data_offset": 0, 00:37:45.135 "data_size": 65536 00:37:45.135 } 00:37:45.135 ] 00:37:45.135 }' 00:37:45.135 11:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:45.135 11:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:45.135 11:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:45.393 11:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:45.393 11:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:45.393 [2024-06-10 11:48:29.132967] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:37:45.393 [2024-06-10 11:48:29.133207] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:37:45.650 [2024-06-10 11:48:29.571956] bdev_raid.c: 839:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:46.216 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:46.473 [2024-06-10 11:48:30.246310] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:37:46.473 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:46.473 "name": "raid_bdev1", 00:37:46.473 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:46.473 "strip_size_kb": 0, 00:37:46.473 "state": "online", 00:37:46.473 "raid_level": "raid1", 00:37:46.473 "superblock": false, 00:37:46.473 "num_base_bdevs": 2, 00:37:46.473 "num_base_bdevs_discovered": 2, 00:37:46.473 "num_base_bdevs_operational": 2, 00:37:46.473 "process": { 00:37:46.473 "type": "rebuild", 00:37:46.473 "target": "spare", 00:37:46.473 "progress": { 00:37:46.473 "blocks": 38912, 00:37:46.473 "percent": 59 00:37:46.473 } 00:37:46.473 }, 00:37:46.473 "base_bdevs_list": [ 00:37:46.473 { 00:37:46.473 "name": "spare", 00:37:46.473 "uuid": 
"77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:46.473 "is_configured": true, 00:37:46.473 "data_offset": 0, 00:37:46.474 "data_size": 65536 00:37:46.474 }, 00:37:46.474 { 00:37:46.474 "name": "BaseBdev2", 00:37:46.474 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:46.474 "is_configured": true, 00:37:46.474 "data_offset": 0, 00:37:46.474 "data_size": 65536 00:37:46.474 } 00:37:46.474 ] 00:37:46.474 }' 00:37:46.474 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:46.474 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:46.474 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:46.474 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:46.474 11:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:46.731 [2024-06-10 11:48:30.569277] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:37:46.989 [2024-06-10 11:48:30.904079] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:37:46.989 [2024-06-10 11:48:30.904496] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # 
local target=spare 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:47.553 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:47.809 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:47.809 "name": "raid_bdev1", 00:37:47.809 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:47.809 "strip_size_kb": 0, 00:37:47.809 "state": "online", 00:37:47.809 "raid_level": "raid1", 00:37:47.809 "superblock": false, 00:37:47.809 "num_base_bdevs": 2, 00:37:47.809 "num_base_bdevs_discovered": 2, 00:37:47.809 "num_base_bdevs_operational": 2, 00:37:47.809 "process": { 00:37:47.809 "type": "rebuild", 00:37:47.809 "target": "spare", 00:37:47.809 "progress": { 00:37:47.809 "blocks": 59392, 00:37:47.809 "percent": 90 00:37:47.809 } 00:37:47.809 }, 00:37:47.809 "base_bdevs_list": [ 00:37:47.809 { 00:37:47.809 "name": "spare", 00:37:47.809 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:47.809 "is_configured": true, 00:37:47.809 "data_offset": 0, 00:37:47.809 "data_size": 65536 00:37:47.809 }, 00:37:47.809 { 00:37:47.809 "name": "BaseBdev2", 00:37:47.809 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:47.810 "is_configured": true, 00:37:47.810 "data_offset": 0, 00:37:47.810 "data_size": 65536 00:37:47.810 } 00:37:47.810 ] 00:37:47.810 }' 00:37:47.810 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:47.810 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:47.810 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:47.810 11:48:31 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:47.810 11:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:48.066 [2024-06-10 11:48:31.778670] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:37:48.066 [2024-06-10 11:48:31.884052] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:37:48.066 [2024-06-10 11:48:31.886119] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:48.998 "name": "raid_bdev1", 00:37:48.998 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:48.998 "strip_size_kb": 0, 00:37:48.998 "state": "online", 00:37:48.998 "raid_level": "raid1", 00:37:48.998 "superblock": false, 00:37:48.998 "num_base_bdevs": 2, 00:37:48.998 "num_base_bdevs_discovered": 2, 00:37:48.998 
"num_base_bdevs_operational": 2, 00:37:48.998 "base_bdevs_list": [ 00:37:48.998 { 00:37:48.998 "name": "spare", 00:37:48.998 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:48.998 "is_configured": true, 00:37:48.998 "data_offset": 0, 00:37:48.998 "data_size": 65536 00:37:48.998 }, 00:37:48.998 { 00:37:48.998 "name": "BaseBdev2", 00:37:48.998 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:48.998 "is_configured": true, 00:37:48.998 "data_offset": 0, 00:37:48.998 "data_size": 65536 00:37:48.998 } 00:37:48.998 ] 00:37:48.998 }' 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:48.998 11:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:49.254 11:48:33 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:49.254 "name": "raid_bdev1", 00:37:49.254 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:49.254 "strip_size_kb": 0, 00:37:49.254 "state": "online", 00:37:49.254 "raid_level": "raid1", 00:37:49.254 "superblock": false, 00:37:49.254 "num_base_bdevs": 2, 00:37:49.254 "num_base_bdevs_discovered": 2, 00:37:49.254 "num_base_bdevs_operational": 2, 00:37:49.254 "base_bdevs_list": [ 00:37:49.254 { 00:37:49.254 "name": "spare", 00:37:49.254 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:49.254 "is_configured": true, 00:37:49.254 "data_offset": 0, 00:37:49.254 "data_size": 65536 00:37:49.254 }, 00:37:49.254 { 00:37:49.254 "name": "BaseBdev2", 00:37:49.254 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:49.254 "is_configured": true, 00:37:49.254 "data_offset": 0, 00:37:49.254 "data_size": 65536 00:37:49.254 } 00:37:49.254 ] 00:37:49.254 }' 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:49.254 11:48:33 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:49.254 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:49.511 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:49.511 "name": "raid_bdev1", 00:37:49.511 "uuid": "928b703b-71ea-4fb8-9a35-1d7e334625ae", 00:37:49.511 "strip_size_kb": 0, 00:37:49.511 "state": "online", 00:37:49.511 "raid_level": "raid1", 00:37:49.511 "superblock": false, 00:37:49.511 "num_base_bdevs": 2, 00:37:49.511 "num_base_bdevs_discovered": 2, 00:37:49.511 "num_base_bdevs_operational": 2, 00:37:49.511 "base_bdevs_list": [ 00:37:49.511 { 00:37:49.511 "name": "spare", 00:37:49.511 "uuid": "77bb05bb-d164-58b5-91fa-8ffb7c70cdee", 00:37:49.511 "is_configured": true, 00:37:49.511 "data_offset": 0, 00:37:49.511 "data_size": 65536 00:37:49.511 }, 00:37:49.511 { 00:37:49.511 "name": "BaseBdev2", 00:37:49.511 "uuid": "d6102de7-e352-5771-bd07-412d1a185519", 00:37:49.511 "is_configured": true, 00:37:49.511 "data_offset": 0, 00:37:49.511 "data_size": 65536 00:37:49.511 } 00:37:49.511 ] 00:37:49.511 }' 00:37:49.511 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:49.511 11:48:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 
00:37:50.075 11:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:37:50.075 [2024-06-10 11:48:34.005961] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:37:50.076 [2024-06-10 11:48:34.005988] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:37:50.333 00:37:50.333 Latency(us) 00:37:50.333 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:50.333 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:37:50.333 raid_bdev1 : 10.23 99.94 299.82 0.00 0.00 13604.32 238.64 108504.82 00:37:50.333 =================================================================================================================== 00:37:50.333 Total : 99.94 299.82 0.00 0.00 13604.32 238.64 108504.82 00:37:50.333 [2024-06-10 11:48:34.028733] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:50.333 [2024-06-10 11:48:34.028752] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:37:50.333 [2024-06-10 11:48:34.028800] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:37:50.333 [2024-06-10 11:48:34.028808] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa8b410 name raid_bdev1, state offline 00:37:50.333 0 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 
00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:50.333 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:37:50.591 /dev/nbd0 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w 
nbd0 /proc/partitions 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:50.591 1+0 records in 00:37:50.591 1+0 records out 00:37:50.591 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231211 s, 17.7 MB/s 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:50.591 
11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:50.591 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:37:50.849 /dev/nbd1 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:50.849 11:48:34 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:50.849 1+0 records in 00:37:50.849 1+0 records out 00:37:50.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251425 s, 16.3 MB/s 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:50.849 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:51.106 11:48:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:51.364 11:48:35 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 223607 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 223607 ']' 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 223607 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 223607 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 223607' 00:37:51.364 killing process with pid 223607 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 223607 00:37:51.364 Received shutdown signal, test time was about 11.390165 seconds 00:37:51.364 00:37:51.364 Latency(us) 00:37:51.364 Device Information : runtime(s) IOPS 
MiB/s Fail/s TO/s Average min max 00:37:51.364 =================================================================================================================== 00:37:51.364 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:51.364 [2024-06-10 11:48:35.192935] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:37:51.364 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 223607 00:37:51.364 [2024-06-10 11:48:35.213807] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:37:51.622 00:37:51.622 real 0m14.951s 00:37:51.622 user 0m21.880s 00:37:51.622 sys 0m2.349s 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:51.622 ************************************ 00:37:51.622 END TEST raid_rebuild_test_io 00:37:51.622 ************************************ 00:37:51.622 11:48:35 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:37:51.622 11:48:35 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:37:51.622 11:48:35 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:37:51.622 11:48:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:37:51.622 ************************************ 00:37:51.622 START TEST raid_rebuild_test_sb_io 00:37:51.622 ************************************ 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true true true 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:37:51.622 11:48:35 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 
']' 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=225786 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 225786 /var/tmp/spdk-raid.sock 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 225786 ']' 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:37:51.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:37:51.622 11:48:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:51.622 [2024-06-10 11:48:35.562786] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:37:51.622 [2024-06-10 11:48:35.562839] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid225786 ] 00:37:51.622 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:51.622 Zero copy mechanism will not be used. 00:37:51.882 [2024-06-10 11:48:35.650661] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:51.882 [2024-06-10 11:48:35.731544] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:37:51.882 [2024-06-10 11:48:35.787769] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:51.882 [2024-06-10 11:48:35.787797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:52.540 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:37:52.540 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:37:52.540 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:52.540 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:37:52.797 BaseBdev1_malloc 00:37:52.797 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:37:52.797 [2024-06-10 11:48:36.689186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:37:52.797 [2024-06-10 11:48:36.689231] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:52.797 [2024-06-10 11:48:36.689247] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0x16f9780 00:37:52.797 [2024-06-10 11:48:36.689256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:52.797 [2024-06-10 11:48:36.690430] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:52.797 [2024-06-10 11:48:36.690451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:37:52.797 BaseBdev1 00:37:52.797 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:52.797 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:37:53.054 BaseBdev2_malloc 00:37:53.054 11:48:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:37:53.312 [2024-06-10 11:48:37.041857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:37:53.312 [2024-06-10 11:48:37.041900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:53.312 [2024-06-10 11:48:37.041913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a4a50 00:37:53.312 [2024-06-10 11:48:37.041921] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:53.312 [2024-06-10 11:48:37.042861] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:53.312 [2024-06-10 11:48:37.042889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:37:53.312 BaseBdev2 00:37:53.312 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:37:53.312 spare_malloc 00:37:53.312 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:37:53.569 spare_delay 00:37:53.569 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:53.826 [2024-06-10 11:48:37.574879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:53.826 [2024-06-10 11:48:37.574913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:53.826 [2024-06-10 11:48:37.574942] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a7970 00:37:53.826 [2024-06-10 11:48:37.574951] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:53.826 [2024-06-10 11:48:37.575921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:53.826 [2024-06-10 11:48:37.575941] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:53.826 spare 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:37:53.826 [2024-06-10 11:48:37.743335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:53.826 [2024-06-10 11:48:37.744214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:53.826 [2024-06-10 11:48:37.744331] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a9410 00:37:53.826 [2024-06-10 11:48:37.744340] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, 
blocklen 512 00:37:53.826 [2024-06-10 11:48:37.744470] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a8d20 00:37:53.826 [2024-06-10 11:48:37.744567] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a9410 00:37:53.826 [2024-06-10 11:48:37.744575] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18a9410 00:37:53.826 [2024-06-10 11:48:37.744635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:53.826 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:54.082 
11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:54.082 "name": "raid_bdev1", 00:37:54.082 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:37:54.082 "strip_size_kb": 0, 00:37:54.082 "state": "online", 00:37:54.082 "raid_level": "raid1", 00:37:54.082 "superblock": true, 00:37:54.082 "num_base_bdevs": 2, 00:37:54.082 "num_base_bdevs_discovered": 2, 00:37:54.083 "num_base_bdevs_operational": 2, 00:37:54.083 "base_bdevs_list": [ 00:37:54.083 { 00:37:54.083 "name": "BaseBdev1", 00:37:54.083 "uuid": "2c71d769-ba76-5975-a112-7995f5f008eb", 00:37:54.083 "is_configured": true, 00:37:54.083 "data_offset": 2048, 00:37:54.083 "data_size": 63488 00:37:54.083 }, 00:37:54.083 { 00:37:54.083 "name": "BaseBdev2", 00:37:54.083 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:37:54.083 "is_configured": true, 00:37:54.083 "data_offset": 2048, 00:37:54.083 "data_size": 63488 00:37:54.083 } 00:37:54.083 ] 00:37:54.083 }' 00:37:54.083 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:54.083 11:48:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:54.646 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:37:54.646 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:37:54.903 [2024-06-10 11:48:38.597666] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq 
-r '.[].base_bdevs_list[0].data_offset' 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:37:54.903 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:37:55.161 [2024-06-10 11:48:38.884215] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18aa250 00:37:55.161 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:55.161 Zero copy mechanism will not be used. 00:37:55.161 Running I/O for 60 seconds... 00:37:55.161 [2024-06-10 11:48:38.954856] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:37:55.161 [2024-06-10 11:48:38.964931] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18aa250 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:55.161 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:55.162 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:55.162 11:48:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:55.419 11:48:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:55.420 "name": "raid_bdev1", 00:37:55.420 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:37:55.420 "strip_size_kb": 0, 00:37:55.420 "state": "online", 00:37:55.420 "raid_level": "raid1", 00:37:55.420 "superblock": true, 00:37:55.420 "num_base_bdevs": 2, 00:37:55.420 "num_base_bdevs_discovered": 1, 00:37:55.420 "num_base_bdevs_operational": 1, 00:37:55.420 "base_bdevs_list": [ 00:37:55.420 { 00:37:55.420 "name": null, 00:37:55.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:55.420 "is_configured": false, 00:37:55.420 "data_offset": 2048, 00:37:55.420 "data_size": 63488 00:37:55.420 }, 00:37:55.420 { 00:37:55.420 "name": "BaseBdev2", 00:37:55.420 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:37:55.420 "is_configured": true, 00:37:55.420 "data_offset": 2048, 00:37:55.420 "data_size": 63488 00:37:55.420 } 00:37:55.420 ] 00:37:55.420 }' 00:37:55.420 11:48:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:55.420 11:48:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:55.986 11:48:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:55.986 [2024-06-10 11:48:39.809691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:55.986 11:48:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:37:55.986 [2024-06-10 11:48:39.870094] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x170bf70 00:37:55.986 [2024-06-10 11:48:39.871923] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:56.242 [2024-06-10 11:48:39.992216] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:56.242 [2024-06-10 11:48:39.992618] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:56.500 [2024-06-10 11:48:40.206436] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:56.500 [2024-06-10 11:48:40.206683] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:56.757 [2024-06-10 11:48:40.540137] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:57.014 [2024-06-10 11:48:40.768946] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:57.014 [2024-06-10 11:48:40.769106] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:57.014 11:48:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:57.271 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:57.271 "name": "raid_bdev1", 00:37:57.271 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:37:57.271 "strip_size_kb": 0, 00:37:57.271 "state": "online", 00:37:57.271 "raid_level": "raid1", 00:37:57.271 "superblock": true, 00:37:57.271 "num_base_bdevs": 2, 00:37:57.271 "num_base_bdevs_discovered": 2, 00:37:57.271 "num_base_bdevs_operational": 2, 00:37:57.271 "process": { 00:37:57.271 "type": "rebuild", 00:37:57.271 "target": "spare", 00:37:57.271 "progress": { 00:37:57.271 "blocks": 12288, 00:37:57.271 "percent": 19 00:37:57.271 } 00:37:57.271 }, 00:37:57.271 "base_bdevs_list": [ 00:37:57.271 { 00:37:57.271 "name": "spare", 00:37:57.271 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:37:57.271 "is_configured": true, 00:37:57.271 "data_offset": 2048, 00:37:57.271 "data_size": 63488 00:37:57.271 }, 00:37:57.271 { 00:37:57.271 "name": "BaseBdev2", 00:37:57.271 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:37:57.271 "is_configured": true, 00:37:57.271 "data_offset": 2048, 00:37:57.271 "data_size": 63488 00:37:57.271 } 00:37:57.271 ] 00:37:57.271 }' 00:37:57.271 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:57.271 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:57.271 
11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:57.271 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:57.271 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:37:57.528 [2024-06-10 11:48:41.295913] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:57.528 [2024-06-10 11:48:41.446655] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:57.528 [2024-06-10 11:48:41.453723] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:57.528 [2024-06-10 11:48:41.453746] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:57.528 [2024-06-10 11:48:41.453754] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:57.528 [2024-06-10 11:48:41.468807] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18aa250 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:57.786 "name": "raid_bdev1", 00:37:57.786 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:37:57.786 "strip_size_kb": 0, 00:37:57.786 "state": "online", 00:37:57.786 "raid_level": "raid1", 00:37:57.786 "superblock": true, 00:37:57.786 "num_base_bdevs": 2, 00:37:57.786 "num_base_bdevs_discovered": 1, 00:37:57.786 "num_base_bdevs_operational": 1, 00:37:57.786 "base_bdevs_list": [ 00:37:57.786 { 00:37:57.786 "name": null, 00:37:57.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:57.786 "is_configured": false, 00:37:57.786 "data_offset": 2048, 00:37:57.786 "data_size": 63488 00:37:57.786 }, 00:37:57.786 { 00:37:57.786 "name": "BaseBdev2", 00:37:57.786 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:37:57.786 "is_configured": true, 00:37:57.786 "data_offset": 2048, 00:37:57.786 "data_size": 63488 00:37:57.786 } 00:37:57.786 ] 00:37:57.786 }' 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:57.786 11:48:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:58.356 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:58.356 11:48:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:58.356 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:58.356 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:58.356 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:58.356 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:58.356 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:58.625 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:58.625 "name": "raid_bdev1", 00:37:58.625 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:37:58.625 "strip_size_kb": 0, 00:37:58.625 "state": "online", 00:37:58.625 "raid_level": "raid1", 00:37:58.625 "superblock": true, 00:37:58.625 "num_base_bdevs": 2, 00:37:58.625 "num_base_bdevs_discovered": 1, 00:37:58.625 "num_base_bdevs_operational": 1, 00:37:58.625 "base_bdevs_list": [ 00:37:58.625 { 00:37:58.625 "name": null, 00:37:58.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:58.625 "is_configured": false, 00:37:58.625 "data_offset": 2048, 00:37:58.625 "data_size": 63488 00:37:58.625 }, 00:37:58.625 { 00:37:58.625 "name": "BaseBdev2", 00:37:58.625 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:37:58.625 "is_configured": true, 00:37:58.625 "data_offset": 2048, 00:37:58.625 "data_size": 63488 00:37:58.625 } 00:37:58.625 ] 00:37:58.625 }' 00:37:58.625 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:58.625 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:58.625 11:48:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:58.625 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:58.625 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:58.883 [2024-06-10 11:48:42.601966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:58.883 11:48:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:37:58.883 [2024-06-10 11:48:42.665429] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a9370 00:37:58.883 [2024-06-10 11:48:42.666584] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:58.883 [2024-06-10 11:48:42.784738] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:58.883 [2024-06-10 11:48:42.785130] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:59.140 [2024-06-10 11:48:42.993029] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:59.140 [2024-06-10 11:48:42.993166] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:59.398 [2024-06-10 11:48:43.233927] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:59.398 [2024-06-10 11:48:43.234169] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:59.656 [2024-06-10 11:48:43.357519] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 
offset_end: 12288 00:37:59.656 [2024-06-10 11:48:43.357656] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:59.914 [2024-06-10 11:48:43.668812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:59.914 [2024-06-10 11:48:43.669304] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:59.914 "name": "raid_bdev1", 00:37:59.914 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:37:59.914 "strip_size_kb": 0, 00:37:59.914 "state": "online", 00:37:59.914 "raid_level": "raid1", 00:37:59.914 "superblock": true, 00:37:59.914 "num_base_bdevs": 2, 00:37:59.914 "num_base_bdevs_discovered": 2, 00:37:59.914 "num_base_bdevs_operational": 2, 00:37:59.914 "process": { 00:37:59.914 "type": "rebuild", 00:37:59.914 "target": "spare", 00:37:59.914 "progress": { 
00:37:59.914 "blocks": 16384, 00:37:59.914 "percent": 25 00:37:59.914 } 00:37:59.914 }, 00:37:59.914 "base_bdevs_list": [ 00:37:59.914 { 00:37:59.914 "name": "spare", 00:37:59.914 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:37:59.914 "is_configured": true, 00:37:59.914 "data_offset": 2048, 00:37:59.914 "data_size": 63488 00:37:59.914 }, 00:37:59.914 { 00:37:59.914 "name": "BaseBdev2", 00:37:59.914 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:37:59.914 "is_configured": true, 00:37:59.914 "data_offset": 2048, 00:37:59.914 "data_size": 63488 00:37:59.914 } 00:37:59.914 ] 00:37:59.914 }' 00:37:59.914 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:38:00.172 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:38:00.172 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=653 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:00.173 11:48:43 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:00.173 11:48:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:00.173 11:48:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:00.173 "name": "raid_bdev1", 00:38:00.173 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:00.173 "strip_size_kb": 0, 00:38:00.173 "state": "online", 00:38:00.173 "raid_level": "raid1", 00:38:00.173 "superblock": true, 00:38:00.173 "num_base_bdevs": 2, 00:38:00.173 "num_base_bdevs_discovered": 2, 00:38:00.173 "num_base_bdevs_operational": 2, 00:38:00.173 "process": { 00:38:00.173 "type": "rebuild", 00:38:00.173 "target": "spare", 00:38:00.173 "progress": { 00:38:00.173 "blocks": 20480, 00:38:00.173 "percent": 32 00:38:00.173 } 00:38:00.173 }, 00:38:00.173 "base_bdevs_list": [ 00:38:00.173 { 00:38:00.173 "name": "spare", 00:38:00.173 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:00.173 "is_configured": true, 00:38:00.173 "data_offset": 2048, 00:38:00.173 "data_size": 63488 00:38:00.173 }, 00:38:00.173 { 00:38:00.173 "name": "BaseBdev2", 00:38:00.173 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:00.173 "is_configured": true, 00:38:00.173 "data_offset": 2048, 
00:38:00.173 "data_size": 63488 00:38:00.173 } 00:38:00.173 ] 00:38:00.173 }' 00:38:00.173 11:48:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:00.430 11:48:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:00.430 11:48:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:00.430 11:48:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:00.430 11:48:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:00.430 [2024-06-10 11:48:44.313912] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:38:00.688 [2024-06-10 11:48:44.537537] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:38:00.945 [2024-06-10 11:48:44.868911] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:38:01.203 [2024-06-10 11:48:44.980847] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:38:01.460 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:01.461 
11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:01.461 "name": "raid_bdev1", 00:38:01.461 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:01.461 "strip_size_kb": 0, 00:38:01.461 "state": "online", 00:38:01.461 "raid_level": "raid1", 00:38:01.461 "superblock": true, 00:38:01.461 "num_base_bdevs": 2, 00:38:01.461 "num_base_bdevs_discovered": 2, 00:38:01.461 "num_base_bdevs_operational": 2, 00:38:01.461 "process": { 00:38:01.461 "type": "rebuild", 00:38:01.461 "target": "spare", 00:38:01.461 "progress": { 00:38:01.461 "blocks": 38912, 00:38:01.461 "percent": 61 00:38:01.461 } 00:38:01.461 }, 00:38:01.461 "base_bdevs_list": [ 00:38:01.461 { 00:38:01.461 "name": "spare", 00:38:01.461 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:01.461 "is_configured": true, 00:38:01.461 "data_offset": 2048, 00:38:01.461 "data_size": 63488 00:38:01.461 }, 00:38:01.461 { 00:38:01.461 "name": "BaseBdev2", 00:38:01.461 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:01.461 "is_configured": true, 00:38:01.461 "data_offset": 2048, 00:38:01.461 "data_size": 63488 00:38:01.461 } 00:38:01.461 ] 00:38:01.461 }' 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:01.461 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:01.719 11:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:01.719 11:48:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:01.719 [2024-06-10 11:48:45.629204] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:38:01.719 [2024-06-10 11:48:45.629425] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:38:02.652 [2024-06-10 11:48:46.399355] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:02.652 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:02.910 [2024-06-10 11:48:46.609356] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:38:02.910 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:02.910 "name": "raid_bdev1", 00:38:02.910 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:02.910 "strip_size_kb": 0, 00:38:02.910 
"state": "online", 00:38:02.910 "raid_level": "raid1", 00:38:02.910 "superblock": true, 00:38:02.910 "num_base_bdevs": 2, 00:38:02.910 "num_base_bdevs_discovered": 2, 00:38:02.910 "num_base_bdevs_operational": 2, 00:38:02.910 "process": { 00:38:02.910 "type": "rebuild", 00:38:02.910 "target": "spare", 00:38:02.910 "progress": { 00:38:02.910 "blocks": 59392, 00:38:02.910 "percent": 93 00:38:02.910 } 00:38:02.910 }, 00:38:02.910 "base_bdevs_list": [ 00:38:02.910 { 00:38:02.910 "name": "spare", 00:38:02.910 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:02.910 "is_configured": true, 00:38:02.910 "data_offset": 2048, 00:38:02.910 "data_size": 63488 00:38:02.910 }, 00:38:02.910 { 00:38:02.910 "name": "BaseBdev2", 00:38:02.910 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:02.910 "is_configured": true, 00:38:02.910 "data_offset": 2048, 00:38:02.910 "data_size": 63488 00:38:02.910 } 00:38:02.910 ] 00:38:02.910 }' 00:38:02.910 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:02.910 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:02.910 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:02.910 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:02.910 11:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:03.168 [2024-06-10 11:48:46.939150] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:38:03.168 [2024-06-10 11:48:47.037137] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:38:03.168 [2024-06-10 11:48:47.038761] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 
00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:04.102 "name": "raid_bdev1", 00:38:04.102 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:04.102 "strip_size_kb": 0, 00:38:04.102 "state": "online", 00:38:04.102 "raid_level": "raid1", 00:38:04.102 "superblock": true, 00:38:04.102 "num_base_bdevs": 2, 00:38:04.102 "num_base_bdevs_discovered": 2, 00:38:04.102 "num_base_bdevs_operational": 2, 00:38:04.102 "base_bdevs_list": [ 00:38:04.102 { 00:38:04.102 "name": "spare", 00:38:04.102 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:04.102 "is_configured": true, 00:38:04.102 "data_offset": 2048, 00:38:04.102 "data_size": 63488 00:38:04.102 }, 00:38:04.102 { 00:38:04.102 "name": "BaseBdev2", 00:38:04.102 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:04.102 "is_configured": true, 00:38:04.102 "data_offset": 2048, 00:38:04.102 "data_size": 63488 00:38:04.102 } 00:38:04.102 ] 00:38:04.102 }' 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // 
"none"' 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:04.102 11:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:04.360 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:04.360 "name": "raid_bdev1", 00:38:04.360 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:04.360 "strip_size_kb": 0, 00:38:04.360 "state": "online", 00:38:04.360 "raid_level": "raid1", 00:38:04.360 "superblock": true, 00:38:04.360 "num_base_bdevs": 2, 00:38:04.360 "num_base_bdevs_discovered": 2, 00:38:04.360 "num_base_bdevs_operational": 2, 00:38:04.360 "base_bdevs_list": [ 00:38:04.360 { 00:38:04.360 "name": "spare", 00:38:04.360 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:04.360 "is_configured": true, 00:38:04.360 
"data_offset": 2048, 00:38:04.360 "data_size": 63488 00:38:04.360 }, 00:38:04.360 { 00:38:04.360 "name": "BaseBdev2", 00:38:04.360 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:04.360 "is_configured": true, 00:38:04.360 "data_offset": 2048, 00:38:04.360 "data_size": 63488 00:38:04.360 } 00:38:04.361 ] 00:38:04.361 }' 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:04.361 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:04.619 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:04.619 "name": "raid_bdev1", 00:38:04.619 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:04.619 "strip_size_kb": 0, 00:38:04.619 "state": "online", 00:38:04.619 "raid_level": "raid1", 00:38:04.619 "superblock": true, 00:38:04.619 "num_base_bdevs": 2, 00:38:04.619 "num_base_bdevs_discovered": 2, 00:38:04.619 "num_base_bdevs_operational": 2, 00:38:04.619 "base_bdevs_list": [ 00:38:04.619 { 00:38:04.619 "name": "spare", 00:38:04.619 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:04.619 "is_configured": true, 00:38:04.619 "data_offset": 2048, 00:38:04.619 "data_size": 63488 00:38:04.619 }, 00:38:04.619 { 00:38:04.619 "name": "BaseBdev2", 00:38:04.619 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:04.619 "is_configured": true, 00:38:04.619 "data_offset": 2048, 00:38:04.619 "data_size": 63488 00:38:04.619 } 00:38:04.619 ] 00:38:04.619 }' 00:38:04.619 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:04.619 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:05.183 11:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:38:05.183 [2024-06-10 11:48:49.083077] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:05.183 [2024-06-10 11:48:49.083110] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:05.452 00:38:05.452 Latency(us) 00:38:05.452 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min 
max 00:38:05.452 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:38:05.452 raid_bdev1 : 10.27 116.53 349.58 0.00 0.00 11914.17 242.20 110784.33 00:38:05.452 =================================================================================================================== 00:38:05.452 Total : 116.53 349.58 0.00 0.00 11914.17 242.20 110784.33 00:38:05.452 [2024-06-10 11:48:49.186219] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:05.452 [2024-06-10 11:48:49.186242] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:05.452 [2024-06-10 11:48:49.186291] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:05.452 [2024-06-10 11:48:49.186299] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a9410 name raid_bdev1, state offline 00:38:05.452 0 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:38:05.452 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- 
# local bdev_list 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:05.453 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:38:05.709 /dev/nbd0 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:05.709 1+0 records in 00:38:05.709 1+0 records out 00:38:05.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251993 s, 16.3 MB/s 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:05.709 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:38:05.967 /dev/nbd1 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:05.967 1+0 records in 00:38:05.967 1+0 records out 00:38:05.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246215 s, 16.6 MB/s 00:38:05.967 11:48:49 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:05.967 11:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:06.224 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:06.481 11:48:50 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:38:06.481 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:06.739 [2024-06-10 11:48:50.611490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:06.739 [2024-06-10 11:48:50.611531] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:06.739 [2024-06-10 11:48:50.611563] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x170a610 00:38:06.739 [2024-06-10 11:48:50.611572] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:06.739 [2024-06-10 11:48:50.612797] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:06.739 [2024-06-10 11:48:50.612822] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:06.739 [2024-06-10 11:48:50.612890] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:38:06.739 [2024-06-10 11:48:50.612911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:06.739 [2024-06-10 11:48:50.612987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:38:06.739 spare 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:06.739 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:06.997 [2024-06-10 11:48:50.713288] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f89c0 00:38:06.997 [2024-06-10 11:48:50.713305] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:38:06.997 [2024-06-10 11:48:50.713454] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f9040 00:38:06.997 [2024-06-10 11:48:50.713573] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f89c0 
00:38:06.997 [2024-06-10 11:48:50.713580] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16f89c0 00:38:06.997 [2024-06-10 11:48:50.713662] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:06.997 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:06.997 "name": "raid_bdev1", 00:38:06.997 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:06.997 "strip_size_kb": 0, 00:38:06.997 "state": "online", 00:38:06.997 "raid_level": "raid1", 00:38:06.997 "superblock": true, 00:38:06.997 "num_base_bdevs": 2, 00:38:06.997 "num_base_bdevs_discovered": 2, 00:38:06.997 "num_base_bdevs_operational": 2, 00:38:06.997 "base_bdevs_list": [ 00:38:06.997 { 00:38:06.997 "name": "spare", 00:38:06.997 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:06.997 "is_configured": true, 00:38:06.997 "data_offset": 2048, 00:38:06.997 "data_size": 63488 00:38:06.997 }, 00:38:06.997 { 00:38:06.997 "name": "BaseBdev2", 00:38:06.997 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:06.997 "is_configured": true, 00:38:06.997 "data_offset": 2048, 00:38:06.997 "data_size": 63488 00:38:06.997 } 00:38:06.997 ] 00:38:06.997 }' 00:38:06.997 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:06.997 11:48:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local 
raid_bdev_info 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:07.562 "name": "raid_bdev1", 00:38:07.562 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:07.562 "strip_size_kb": 0, 00:38:07.562 "state": "online", 00:38:07.562 "raid_level": "raid1", 00:38:07.562 "superblock": true, 00:38:07.562 "num_base_bdevs": 2, 00:38:07.562 "num_base_bdevs_discovered": 2, 00:38:07.562 "num_base_bdevs_operational": 2, 00:38:07.562 "base_bdevs_list": [ 00:38:07.562 { 00:38:07.562 "name": "spare", 00:38:07.562 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:07.562 "is_configured": true, 00:38:07.562 "data_offset": 2048, 00:38:07.562 "data_size": 63488 00:38:07.562 }, 00:38:07.562 { 00:38:07.562 "name": "BaseBdev2", 00:38:07.562 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:07.562 "is_configured": true, 00:38:07.562 "data_offset": 2048, 00:38:07.562 "data_size": 63488 00:38:07.562 } 00:38:07.562 ] 00:38:07.562 }' 00:38:07.562 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:07.563 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:07.821 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:07.821 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:07.821 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:38:07.821 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:07.821 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:38:07.821 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:38:08.078 [2024-06-10 11:48:51.894978] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:08.078 11:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:38:08.334 11:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:08.334 "name": "raid_bdev1", 00:38:08.334 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:08.334 "strip_size_kb": 0, 00:38:08.334 "state": "online", 00:38:08.334 "raid_level": "raid1", 00:38:08.334 "superblock": true, 00:38:08.334 "num_base_bdevs": 2, 00:38:08.334 "num_base_bdevs_discovered": 1, 00:38:08.334 "num_base_bdevs_operational": 1, 00:38:08.334 "base_bdevs_list": [ 00:38:08.334 { 00:38:08.334 "name": null, 00:38:08.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:08.334 "is_configured": false, 00:38:08.334 "data_offset": 2048, 00:38:08.334 "data_size": 63488 00:38:08.334 }, 00:38:08.334 { 00:38:08.334 "name": "BaseBdev2", 00:38:08.334 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:08.334 "is_configured": true, 00:38:08.334 "data_offset": 2048, 00:38:08.334 "data_size": 63488 00:38:08.334 } 00:38:08.334 ] 00:38:08.334 }' 00:38:08.334 11:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:08.334 11:48:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:08.897 11:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:08.897 [2024-06-10 11:48:52.749271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:08.897 [2024-06-10 11:48:52.749391] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:38:08.897 [2024-06-10 11:48:52.749402] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:38:08.897 [2024-06-10 11:48:52.749423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:08.897 [2024-06-10 11:48:52.754277] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14011c0 00:38:08.897 [2024-06-10 11:48:52.755976] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:08.897 11:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:38:09.828 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:09.829 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:09.829 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:09.829 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:09.829 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:10.086 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:10.086 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:10.086 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:10.086 "name": "raid_bdev1", 00:38:10.086 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:10.086 "strip_size_kb": 0, 00:38:10.086 "state": "online", 00:38:10.086 "raid_level": "raid1", 00:38:10.086 "superblock": true, 00:38:10.086 "num_base_bdevs": 2, 00:38:10.086 "num_base_bdevs_discovered": 2, 00:38:10.086 "num_base_bdevs_operational": 2, 00:38:10.086 "process": { 00:38:10.086 "type": "rebuild", 00:38:10.087 "target": "spare", 00:38:10.087 "progress": { 00:38:10.087 "blocks": 22528, 
00:38:10.087 "percent": 35 00:38:10.087 } 00:38:10.087 }, 00:38:10.087 "base_bdevs_list": [ 00:38:10.087 { 00:38:10.087 "name": "spare", 00:38:10.087 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:10.087 "is_configured": true, 00:38:10.087 "data_offset": 2048, 00:38:10.087 "data_size": 63488 00:38:10.087 }, 00:38:10.087 { 00:38:10.087 "name": "BaseBdev2", 00:38:10.087 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:10.087 "is_configured": true, 00:38:10.087 "data_offset": 2048, 00:38:10.087 "data_size": 63488 00:38:10.087 } 00:38:10.087 ] 00:38:10.087 }' 00:38:10.087 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:10.087 11:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:10.087 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:10.363 [2024-06-10 11:48:54.198212] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:10.363 [2024-06-10 11:48:54.266814] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:10.363 [2024-06-10 11:48:54.266847] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:10.363 [2024-06-10 11:48:54.266862] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:10.363 [2024-06-10 11:48:54.266870] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:10.363 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:10.619 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:10.619 "name": "raid_bdev1", 00:38:10.619 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:10.619 "strip_size_kb": 0, 00:38:10.619 "state": "online", 00:38:10.619 "raid_level": "raid1", 00:38:10.619 "superblock": true, 00:38:10.619 "num_base_bdevs": 2, 00:38:10.619 "num_base_bdevs_discovered": 1, 00:38:10.619 "num_base_bdevs_operational": 1, 00:38:10.619 "base_bdevs_list": [ 00:38:10.619 { 00:38:10.619 "name": null, 00:38:10.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:10.620 "is_configured": false, 00:38:10.620 
"data_offset": 2048, 00:38:10.620 "data_size": 63488 00:38:10.620 }, 00:38:10.620 { 00:38:10.620 "name": "BaseBdev2", 00:38:10.620 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:10.620 "is_configured": true, 00:38:10.620 "data_offset": 2048, 00:38:10.620 "data_size": 63488 00:38:10.620 } 00:38:10.620 ] 00:38:10.620 }' 00:38:10.620 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:10.620 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:11.183 11:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:11.183 [2024-06-10 11:48:55.090057] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:11.183 [2024-06-10 11:48:55.090096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:11.183 [2024-06-10 11:48:55.090127] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x170bf10 00:38:11.183 [2024-06-10 11:48:55.090136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:11.183 [2024-06-10 11:48:55.090415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:11.183 [2024-06-10 11:48:55.090428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:11.183 [2024-06-10 11:48:55.090490] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:38:11.183 [2024-06-10 11:48:55.090498] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:38:11.183 [2024-06-10 11:48:55.090506] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:38:11.183 [2024-06-10 11:48:55.090520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:11.183 [2024-06-10 11:48:55.095250] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a9080 00:38:11.183 [2024-06-10 11:48:55.096326] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:11.183 spare 00:38:11.183 11:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:12.556 "name": "raid_bdev1", 00:38:12.556 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:12.556 "strip_size_kb": 0, 00:38:12.556 "state": "online", 00:38:12.556 "raid_level": "raid1", 00:38:12.556 "superblock": true, 00:38:12.556 "num_base_bdevs": 2, 00:38:12.556 "num_base_bdevs_discovered": 2, 00:38:12.556 "num_base_bdevs_operational": 2, 00:38:12.556 "process": { 00:38:12.556 "type": "rebuild", 00:38:12.556 "target": "spare", 00:38:12.556 "progress": { 00:38:12.556 
"blocks": 22528, 00:38:12.556 "percent": 35 00:38:12.556 } 00:38:12.556 }, 00:38:12.556 "base_bdevs_list": [ 00:38:12.556 { 00:38:12.556 "name": "spare", 00:38:12.556 "uuid": "49a95011-b6d9-572f-8637-2335da95cd01", 00:38:12.556 "is_configured": true, 00:38:12.556 "data_offset": 2048, 00:38:12.556 "data_size": 63488 00:38:12.556 }, 00:38:12.556 { 00:38:12.556 "name": "BaseBdev2", 00:38:12.556 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:12.556 "is_configured": true, 00:38:12.556 "data_offset": 2048, 00:38:12.556 "data_size": 63488 00:38:12.556 } 00:38:12.556 ] 00:38:12.556 }' 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:12.556 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:12.817 [2024-06-10 11:48:56.547836] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:12.817 [2024-06-10 11:48:56.607550] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:12.817 [2024-06-10 11:48:56.607597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:12.817 [2024-06-10 11:48:56.607607] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:12.817 [2024-06-10 11:48:56.607613] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:12.817 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:13.074 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:13.074 "name": "raid_bdev1", 00:38:13.074 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:13.074 "strip_size_kb": 0, 00:38:13.074 "state": "online", 00:38:13.074 "raid_level": "raid1", 00:38:13.074 "superblock": true, 00:38:13.075 "num_base_bdevs": 2, 00:38:13.075 "num_base_bdevs_discovered": 1, 00:38:13.075 "num_base_bdevs_operational": 1, 00:38:13.075 "base_bdevs_list": [ 00:38:13.075 { 00:38:13.075 "name": null, 00:38:13.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:13.075 "is_configured": false, 00:38:13.075 
"data_offset": 2048, 00:38:13.075 "data_size": 63488 00:38:13.075 }, 00:38:13.075 { 00:38:13.075 "name": "BaseBdev2", 00:38:13.075 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:13.075 "is_configured": true, 00:38:13.075 "data_offset": 2048, 00:38:13.075 "data_size": 63488 00:38:13.075 } 00:38:13.075 ] 00:38:13.075 }' 00:38:13.075 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:13.075 11:48:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:13.641 "name": "raid_bdev1", 00:38:13.641 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:13.641 "strip_size_kb": 0, 00:38:13.641 "state": "online", 00:38:13.641 "raid_level": "raid1", 00:38:13.641 "superblock": true, 00:38:13.641 "num_base_bdevs": 2, 00:38:13.641 "num_base_bdevs_discovered": 1, 00:38:13.641 "num_base_bdevs_operational": 1, 00:38:13.641 "base_bdevs_list": [ 00:38:13.641 { 00:38:13.641 "name": null, 00:38:13.641 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:38:13.641 "is_configured": false, 00:38:13.641 "data_offset": 2048, 00:38:13.641 "data_size": 63488 00:38:13.641 }, 00:38:13.641 { 00:38:13.641 "name": "BaseBdev2", 00:38:13.641 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:13.641 "is_configured": true, 00:38:13.641 "data_offset": 2048, 00:38:13.641 "data_size": 63488 00:38:13.641 } 00:38:13.641 ] 00:38:13.641 }' 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:13.641 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:38:13.899 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:38:14.157 [2024-06-10 11:48:57.879500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:38:14.157 [2024-06-10 11:48:57.879546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:14.157 [2024-06-10 11:48:57.879563] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f8d40 00:38:14.157 [2024-06-10 11:48:57.879572] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:14.157 [2024-06-10 11:48:57.879851] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:14.157 [2024-06-10 11:48:57.879863] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:38:14.157 [2024-06-10 11:48:57.879933] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:38:14.157 [2024-06-10 11:48:57.879943] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:38:14.157 [2024-06-10 11:48:57.879950] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:38:14.157 BaseBdev1 00:38:14.157 11:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:15.093 11:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:15.093 11:48:58 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:15.350 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:15.350 "name": "raid_bdev1", 00:38:15.350 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:15.350 "strip_size_kb": 0, 00:38:15.351 "state": "online", 00:38:15.351 "raid_level": "raid1", 00:38:15.351 "superblock": true, 00:38:15.351 "num_base_bdevs": 2, 00:38:15.351 "num_base_bdevs_discovered": 1, 00:38:15.351 "num_base_bdevs_operational": 1, 00:38:15.351 "base_bdevs_list": [ 00:38:15.351 { 00:38:15.351 "name": null, 00:38:15.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:15.351 "is_configured": false, 00:38:15.351 "data_offset": 2048, 00:38:15.351 "data_size": 63488 00:38:15.351 }, 00:38:15.351 { 00:38:15.351 "name": "BaseBdev2", 00:38:15.351 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:15.351 "is_configured": true, 00:38:15.351 "data_offset": 2048, 00:38:15.351 "data_size": 63488 00:38:15.351 } 00:38:15.351 ] 00:38:15.351 }' 00:38:15.351 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:15.351 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:15.916 "name": "raid_bdev1", 00:38:15.916 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:15.916 "strip_size_kb": 0, 00:38:15.916 "state": "online", 00:38:15.916 "raid_level": "raid1", 00:38:15.916 "superblock": true, 00:38:15.916 "num_base_bdevs": 2, 00:38:15.916 "num_base_bdevs_discovered": 1, 00:38:15.916 "num_base_bdevs_operational": 1, 00:38:15.916 "base_bdevs_list": [ 00:38:15.916 { 00:38:15.916 "name": null, 00:38:15.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:15.916 "is_configured": false, 00:38:15.916 "data_offset": 2048, 00:38:15.916 "data_size": 63488 00:38:15.916 }, 00:38:15.916 { 00:38:15.916 "name": "BaseBdev2", 00:38:15.916 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:15.916 "is_configured": true, 00:38:15.916 "data_offset": 2048, 00:38:15.916 "data_size": 63488 00:38:15.916 } 00:38:15.916 ] 00:38:15.916 }' 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local 
es=0 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:38:15.916 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:38:16.175 [2024-06-10 11:48:59.977124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:16.175 [2024-06-10 11:48:59.977229] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:38:16.175 
[2024-06-10 11:48:59.977240] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:38:16.175 request: 00:38:16.175 { 00:38:16.175 "raid_bdev": "raid_bdev1", 00:38:16.175 "base_bdev": "BaseBdev1", 00:38:16.175 "method": "bdev_raid_add_base_bdev", 00:38:16.175 "req_id": 1 00:38:16.175 } 00:38:16.175 Got JSON-RPC error response 00:38:16.175 response: 00:38:16.175 { 00:38:16.175 "code": -22, 00:38:16.175 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:38:16.175 } 00:38:16.175 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:38:16.175 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:38:16.175 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:38:16.175 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:38:16.175 11:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:17.181 11:49:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:17.181 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:17.439 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:17.439 "name": "raid_bdev1", 00:38:17.439 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:17.439 "strip_size_kb": 0, 00:38:17.439 "state": "online", 00:38:17.439 "raid_level": "raid1", 00:38:17.439 "superblock": true, 00:38:17.439 "num_base_bdevs": 2, 00:38:17.439 "num_base_bdevs_discovered": 1, 00:38:17.439 "num_base_bdevs_operational": 1, 00:38:17.439 "base_bdevs_list": [ 00:38:17.439 { 00:38:17.439 "name": null, 00:38:17.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:17.439 "is_configured": false, 00:38:17.439 "data_offset": 2048, 00:38:17.439 "data_size": 63488 00:38:17.439 }, 00:38:17.439 { 00:38:17.439 "name": "BaseBdev2", 00:38:17.439 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:17.439 "is_configured": true, 00:38:17.439 "data_offset": 2048, 00:38:17.439 "data_size": 63488 00:38:17.439 } 00:38:17.439 ] 00:38:17.439 }' 00:38:17.439 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:17.439 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:18.004 11:49:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:18.004 "name": "raid_bdev1", 00:38:18.004 "uuid": "86502149-56b6-42d9-be54-469021f334fc", 00:38:18.004 "strip_size_kb": 0, 00:38:18.004 "state": "online", 00:38:18.004 "raid_level": "raid1", 00:38:18.004 "superblock": true, 00:38:18.004 "num_base_bdevs": 2, 00:38:18.004 "num_base_bdevs_discovered": 1, 00:38:18.004 "num_base_bdevs_operational": 1, 00:38:18.004 "base_bdevs_list": [ 00:38:18.004 { 00:38:18.004 "name": null, 00:38:18.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:18.004 "is_configured": false, 00:38:18.004 "data_offset": 2048, 00:38:18.004 "data_size": 63488 00:38:18.004 }, 00:38:18.004 { 00:38:18.004 "name": "BaseBdev2", 00:38:18.004 "uuid": "a2aaaab5-b9f7-5936-9177-c8fb74e893bc", 00:38:18.004 "is_configured": true, 00:38:18.004 "data_offset": 2048, 00:38:18.004 "data_size": 63488 00:38:18.004 } 00:38:18.004 ] 00:38:18.004 }' 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:18.004 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:18.005 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:18.264 11:49:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:18.264 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 225786 00:38:18.264 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 225786 ']' 00:38:18.264 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 225786 00:38:18.264 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:38:18.264 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:38:18.264 11:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 225786 00:38:18.264 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:38:18.264 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:38:18.264 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 225786' 00:38:18.264 killing process with pid 225786 00:38:18.264 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 225786 00:38:18.264 Received shutdown signal, test time was about 23.066114 seconds 00:38:18.264 00:38:18.264 Latency(us) 00:38:18.264 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:18.264 =================================================================================================================== 00:38:18.264 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:18.264 [2024-06-10 11:49:02.008413] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:38:18.265 [2024-06-10 11:49:02.008485] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:18.265 [2024-06-10 11:49:02.008516] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base 
bdevs is 0, going to free all in destruct 00:38:18.265 [2024-06-10 11:49:02.008524] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f89c0 name raid_bdev1, state offline 00:38:18.265 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 225786 00:38:18.265 [2024-06-10 11:49:02.028016] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:38:18.523 00:38:18.523 real 0m26.723s 00:38:18.523 user 0m40.222s 00:38:18.523 sys 0m3.850s 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:38:18.523 ************************************ 00:38:18.523 END TEST raid_rebuild_test_sb_io 00:38:18.523 ************************************ 00:38:18.523 11:49:02 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:38:18.523 11:49:02 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:38:18.523 11:49:02 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:38:18.523 11:49:02 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:38:18.523 11:49:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:38:18.523 ************************************ 00:38:18.523 START TEST raid_rebuild_test 00:38:18.523 ************************************ 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false false true 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:38:18.523 
11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 
00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=229764 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 229764 /var/tmp/spdk-raid.sock 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 229764 ']' 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:38:18.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:38:18.523 11:49:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:38:18.523 [2024-06-10 11:49:02.373612] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:38:18.523 [2024-06-10 11:49:02.373665] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid229764 ] 00:38:18.523 I/O size of 3145728 is greater than zero copy threshold (65536). 00:38:18.523 Zero copy mechanism will not be used. 00:38:18.523 [2024-06-10 11:49:02.461097] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:18.781 [2024-06-10 11:49:02.546681] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:38:18.781 [2024-06-10 11:49:02.601479] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:18.781 [2024-06-10 11:49:02.601512] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:19.345 11:49:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:38:19.345 11:49:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:38:19.345 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:19.345 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:38:19.602 BaseBdev1_malloc 00:38:19.602 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:38:19.602 [2024-06-10 11:49:03.504904] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:38:19.602 [2024-06-10 11:49:03.504952] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:19.602 [2024-06-10 11:49:03.504968] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fcb780 00:38:19.602 [2024-06-10 11:49:03.504977] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:19.602 [2024-06-10 11:49:03.506163] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:19.602 [2024-06-10 11:49:03.506186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:38:19.602 BaseBdev1 00:38:19.602 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:19.602 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:38:19.860 BaseBdev2_malloc 00:38:19.860 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:38:20.118 [2024-06-10 11:49:03.845769] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:38:20.118 [2024-06-10 11:49:03.845808] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:20.118 [2024-06-10 11:49:03.845837] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2176a50 00:38:20.118 [2024-06-10 11:49:03.845846] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:20.118 [2024-06-10 11:49:03.846909] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:20.118 [2024-06-10 11:49:03.846932] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: BaseBdev2 00:38:20.118 BaseBdev2 00:38:20.118 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:20.118 11:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:38:20.118 BaseBdev3_malloc 00:38:20.118 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:38:20.376 [2024-06-10 11:49:04.198315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:38:20.376 [2024-06-10 11:49:04.198354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:20.376 [2024-06-10 11:49:04.198384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2175720 00:38:20.376 [2024-06-10 11:49:04.198392] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:20.376 [2024-06-10 11:49:04.199355] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:20.376 [2024-06-10 11:49:04.199390] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:38:20.376 BaseBdev3 00:38:20.376 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:20.376 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:38:20.634 BaseBdev4_malloc 00:38:20.634 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:38:20.634 [2024-06-10 11:49:04.558847] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:38:20.634 [2024-06-10 11:49:04.558888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:20.634 [2024-06-10 11:49:04.558919] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2179970 00:38:20.634 [2024-06-10 11:49:04.558928] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:20.634 [2024-06-10 11:49:04.559904] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:20.634 [2024-06-10 11:49:04.559925] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:38:20.634 BaseBdev4 00:38:20.634 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:38:20.892 spare_malloc 00:38:20.892 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:38:21.150 spare_delay 00:38:21.150 11:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:21.150 [2024-06-10 11:49:05.079766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:21.150 [2024-06-10 11:49:05.079800] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:21.150 [2024-06-10 11:49:05.079814] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2178060 00:38:21.150 [2024-06-10 11:49:05.079823] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:21.150 [2024-06-10 11:49:05.080787] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:21.150 [2024-06-10 11:49:05.080807] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:21.150 spare 00:38:21.408 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:38:21.408 [2024-06-10 11:49:05.244216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:21.408 [2024-06-10 11:49:05.245047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:38:21.408 [2024-06-10 11:49:05.245085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:38:21.408 [2024-06-10 11:49:05.245113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:38:21.408 [2024-06-10 11:49:05.245165] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x217c790 00:38:21.408 [2024-06-10 11:49:05.245171] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:38:21.409 [2024-06-10 11:49:05.245306] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21765f0 00:38:21.409 [2024-06-10 11:49:05.245405] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x217c790 00:38:21.409 [2024-06-10 11:49:05.245411] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x217c790 00:38:21.409 [2024-06-10 11:49:05.245481] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:21.409 11:49:05 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:21.409 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:21.667 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:21.667 "name": "raid_bdev1", 00:38:21.667 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:21.667 "strip_size_kb": 0, 00:38:21.667 "state": "online", 00:38:21.667 "raid_level": "raid1", 00:38:21.667 "superblock": false, 00:38:21.667 "num_base_bdevs": 4, 00:38:21.667 "num_base_bdevs_discovered": 4, 00:38:21.667 "num_base_bdevs_operational": 4, 00:38:21.667 "base_bdevs_list": [ 00:38:21.667 { 00:38:21.667 "name": "BaseBdev1", 00:38:21.667 "uuid": "6077c913-b512-537c-80d8-162f9c8f7eef", 00:38:21.667 "is_configured": true, 00:38:21.667 "data_offset": 0, 00:38:21.667 "data_size": 65536 00:38:21.667 }, 00:38:21.667 { 00:38:21.667 "name": "BaseBdev2", 00:38:21.667 "uuid": "6fcff2b7-300f-5b08-9a1e-08ba2bd1eae4", 00:38:21.667 "is_configured": true, 00:38:21.667 
"data_offset": 0, 00:38:21.667 "data_size": 65536 00:38:21.667 }, 00:38:21.667 { 00:38:21.667 "name": "BaseBdev3", 00:38:21.667 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:21.667 "is_configured": true, 00:38:21.667 "data_offset": 0, 00:38:21.667 "data_size": 65536 00:38:21.667 }, 00:38:21.667 { 00:38:21.667 "name": "BaseBdev4", 00:38:21.667 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:21.667 "is_configured": true, 00:38:21.667 "data_offset": 0, 00:38:21.667 "data_size": 65536 00:38:21.667 } 00:38:21.667 ] 00:38:21.667 }' 00:38:21.667 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:21.667 11:49:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:38:22.234 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:22.234 11:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:38:22.234 [2024-06-10 11:49:06.090558] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:22.234 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:38:22.234 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:22.234 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:38:22.492 11:49:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:22.492 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:38:22.751 [2024-06-10 11:49:06.443331] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217d330 00:38:22.751 /dev/nbd0 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:22.751 
11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:22.751 1+0 records in 00:38:22.751 1+0 records out 00:38:22.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235438 s, 17.4 MB/s 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:38:22.751 11:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:38:28.015 65536+0 records in 00:38:28.015 65536+0 records out 00:38:28.015 33554432 bytes (34 MB, 32 MiB) copied, 5.21348 s, 6.4 MB/s 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # 
nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:28.015 [2024-06-10 11:49:11.912913] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:38:28.015 11:49:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:38:28.273 [2024-06-10 11:49:12.069358] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:38:28.273 11:49:12 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:28.273 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:28.530 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:28.530 "name": "raid_bdev1", 00:38:28.530 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:28.530 "strip_size_kb": 0, 00:38:28.530 "state": "online", 00:38:28.530 "raid_level": "raid1", 00:38:28.530 "superblock": false, 00:38:28.530 "num_base_bdevs": 4, 00:38:28.530 "num_base_bdevs_discovered": 3, 00:38:28.530 "num_base_bdevs_operational": 3, 00:38:28.530 "base_bdevs_list": [ 00:38:28.530 { 00:38:28.530 "name": null, 00:38:28.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:28.530 "is_configured": false, 
00:38:28.530 "data_offset": 0, 00:38:28.530 "data_size": 65536 00:38:28.530 }, 00:38:28.530 { 00:38:28.530 "name": "BaseBdev2", 00:38:28.530 "uuid": "6fcff2b7-300f-5b08-9a1e-08ba2bd1eae4", 00:38:28.530 "is_configured": true, 00:38:28.530 "data_offset": 0, 00:38:28.530 "data_size": 65536 00:38:28.530 }, 00:38:28.530 { 00:38:28.530 "name": "BaseBdev3", 00:38:28.530 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:28.530 "is_configured": true, 00:38:28.530 "data_offset": 0, 00:38:28.530 "data_size": 65536 00:38:28.530 }, 00:38:28.530 { 00:38:28.530 "name": "BaseBdev4", 00:38:28.530 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:28.530 "is_configured": true, 00:38:28.530 "data_offset": 0, 00:38:28.530 "data_size": 65536 00:38:28.530 } 00:38:28.530 ] 00:38:28.530 }' 00:38:28.531 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:28.531 11:49:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:38:29.096 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:29.096 [2024-06-10 11:49:12.907537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:29.096 [2024-06-10 11:49:12.911191] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2100310 00:38:29.096 [2024-06-10 11:49:12.912845] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:29.096 11:49:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:30.029 11:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:30.287 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:30.287 "name": "raid_bdev1", 00:38:30.287 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:30.287 "strip_size_kb": 0, 00:38:30.287 "state": "online", 00:38:30.287 "raid_level": "raid1", 00:38:30.287 "superblock": false, 00:38:30.287 "num_base_bdevs": 4, 00:38:30.287 "num_base_bdevs_discovered": 4, 00:38:30.287 "num_base_bdevs_operational": 4, 00:38:30.287 "process": { 00:38:30.287 "type": "rebuild", 00:38:30.287 "target": "spare", 00:38:30.287 "progress": { 00:38:30.287 "blocks": 22528, 00:38:30.287 "percent": 34 00:38:30.287 } 00:38:30.287 }, 00:38:30.287 "base_bdevs_list": [ 00:38:30.287 { 00:38:30.287 "name": "spare", 00:38:30.287 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:30.287 "is_configured": true, 00:38:30.287 "data_offset": 0, 00:38:30.287 "data_size": 65536 00:38:30.287 }, 00:38:30.287 { 00:38:30.287 "name": "BaseBdev2", 00:38:30.287 "uuid": "6fcff2b7-300f-5b08-9a1e-08ba2bd1eae4", 00:38:30.287 "is_configured": true, 00:38:30.287 "data_offset": 0, 00:38:30.287 "data_size": 65536 00:38:30.287 }, 00:38:30.287 { 00:38:30.287 "name": "BaseBdev3", 00:38:30.287 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:30.287 "is_configured": true, 00:38:30.287 "data_offset": 0, 00:38:30.287 "data_size": 65536 00:38:30.287 }, 00:38:30.287 { 00:38:30.287 "name": "BaseBdev4", 00:38:30.287 "uuid": 
"21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:30.287 "is_configured": true, 00:38:30.287 "data_offset": 0, 00:38:30.287 "data_size": 65536 00:38:30.287 } 00:38:30.287 ] 00:38:30.287 }' 00:38:30.287 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:30.287 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:30.287 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:30.287 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:30.287 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:38:30.546 [2024-06-10 11:49:14.348953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:30.546 [2024-06-10 11:49:14.423591] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:30.546 [2024-06-10 11:49:14.423626] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:30.546 [2024-06-10 11:49:14.423638] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:30.546 [2024-06-10 11:49:14.423643] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:30.546 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:30.804 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:30.804 "name": "raid_bdev1", 00:38:30.804 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:30.804 "strip_size_kb": 0, 00:38:30.804 "state": "online", 00:38:30.804 "raid_level": "raid1", 00:38:30.804 "superblock": false, 00:38:30.804 "num_base_bdevs": 4, 00:38:30.804 "num_base_bdevs_discovered": 3, 00:38:30.804 "num_base_bdevs_operational": 3, 00:38:30.804 "base_bdevs_list": [ 00:38:30.804 { 00:38:30.804 "name": null, 00:38:30.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:30.804 "is_configured": false, 00:38:30.804 "data_offset": 0, 00:38:30.804 "data_size": 65536 00:38:30.804 }, 00:38:30.804 { 00:38:30.804 "name": "BaseBdev2", 00:38:30.804 "uuid": "6fcff2b7-300f-5b08-9a1e-08ba2bd1eae4", 00:38:30.804 "is_configured": true, 00:38:30.804 "data_offset": 0, 00:38:30.804 "data_size": 65536 00:38:30.804 }, 00:38:30.804 { 00:38:30.804 "name": "BaseBdev3", 00:38:30.804 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:30.804 "is_configured": true, 00:38:30.804 "data_offset": 0, 00:38:30.804 "data_size": 65536 
00:38:30.804 }, 00:38:30.804 { 00:38:30.804 "name": "BaseBdev4", 00:38:30.804 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:30.804 "is_configured": true, 00:38:30.804 "data_offset": 0, 00:38:30.804 "data_size": 65536 00:38:30.804 } 00:38:30.804 ] 00:38:30.804 }' 00:38:30.804 11:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:30.804 11:49:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:31.369 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:31.627 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:31.627 "name": "raid_bdev1", 00:38:31.627 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:31.627 "strip_size_kb": 0, 00:38:31.627 "state": "online", 00:38:31.627 "raid_level": "raid1", 00:38:31.627 "superblock": false, 00:38:31.627 "num_base_bdevs": 4, 00:38:31.627 "num_base_bdevs_discovered": 3, 00:38:31.627 "num_base_bdevs_operational": 3, 00:38:31.627 "base_bdevs_list": [ 00:38:31.627 { 00:38:31.627 "name": null, 00:38:31.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:31.627 "is_configured": false, 00:38:31.627 "data_offset": 0, 00:38:31.627 
"data_size": 65536 00:38:31.627 }, 00:38:31.627 { 00:38:31.627 "name": "BaseBdev2", 00:38:31.627 "uuid": "6fcff2b7-300f-5b08-9a1e-08ba2bd1eae4", 00:38:31.627 "is_configured": true, 00:38:31.627 "data_offset": 0, 00:38:31.627 "data_size": 65536 00:38:31.627 }, 00:38:31.627 { 00:38:31.627 "name": "BaseBdev3", 00:38:31.627 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:31.627 "is_configured": true, 00:38:31.627 "data_offset": 0, 00:38:31.627 "data_size": 65536 00:38:31.627 }, 00:38:31.627 { 00:38:31.627 "name": "BaseBdev4", 00:38:31.627 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:31.627 "is_configured": true, 00:38:31.627 "data_offset": 0, 00:38:31.627 "data_size": 65536 00:38:31.627 } 00:38:31.627 ] 00:38:31.627 }' 00:38:31.627 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:31.627 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:31.627 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:31.627 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:31.627 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:31.627 [2024-06-10 11:49:15.562140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:31.627 [2024-06-10 11:49:15.565809] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217d330 00:38:31.627 [2024-06-10 11:49:15.566897] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:31.884 11:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:38:32.819 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:32.819 
11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:32.819 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:32.819 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:32.819 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:32.819 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:32.819 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:33.078 "name": "raid_bdev1", 00:38:33.078 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:33.078 "strip_size_kb": 0, 00:38:33.078 "state": "online", 00:38:33.078 "raid_level": "raid1", 00:38:33.078 "superblock": false, 00:38:33.078 "num_base_bdevs": 4, 00:38:33.078 "num_base_bdevs_discovered": 4, 00:38:33.078 "num_base_bdevs_operational": 4, 00:38:33.078 "process": { 00:38:33.078 "type": "rebuild", 00:38:33.078 "target": "spare", 00:38:33.078 "progress": { 00:38:33.078 "blocks": 22528, 00:38:33.078 "percent": 34 00:38:33.078 } 00:38:33.078 }, 00:38:33.078 "base_bdevs_list": [ 00:38:33.078 { 00:38:33.078 "name": "spare", 00:38:33.078 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:33.078 "is_configured": true, 00:38:33.078 "data_offset": 0, 00:38:33.078 "data_size": 65536 00:38:33.078 }, 00:38:33.078 { 00:38:33.078 "name": "BaseBdev2", 00:38:33.078 "uuid": "6fcff2b7-300f-5b08-9a1e-08ba2bd1eae4", 00:38:33.078 "is_configured": true, 00:38:33.078 "data_offset": 0, 00:38:33.078 "data_size": 65536 00:38:33.078 }, 00:38:33.078 { 00:38:33.078 "name": "BaseBdev3", 00:38:33.078 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:33.078 
"is_configured": true, 00:38:33.078 "data_offset": 0, 00:38:33.078 "data_size": 65536 00:38:33.078 }, 00:38:33.078 { 00:38:33.078 "name": "BaseBdev4", 00:38:33.078 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:33.078 "is_configured": true, 00:38:33.078 "data_offset": 0, 00:38:33.078 "data_size": 65536 00:38:33.078 } 00:38:33.078 ] 00:38:33.078 }' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:38:33.078 11:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:38:33.078 [2024-06-10 11:49:17.007186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:38:33.337 [2024-06-10 11:49:17.077973] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x217d330 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:33.337 "name": "raid_bdev1", 00:38:33.337 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:33.337 "strip_size_kb": 0, 00:38:33.337 "state": "online", 00:38:33.337 "raid_level": "raid1", 00:38:33.337 "superblock": false, 00:38:33.337 "num_base_bdevs": 4, 00:38:33.337 "num_base_bdevs_discovered": 3, 00:38:33.337 "num_base_bdevs_operational": 3, 00:38:33.337 "process": { 00:38:33.337 "type": "rebuild", 00:38:33.337 "target": "spare", 00:38:33.337 "progress": { 00:38:33.337 "blocks": 32768, 00:38:33.337 "percent": 50 00:38:33.337 } 00:38:33.337 }, 00:38:33.337 "base_bdevs_list": [ 00:38:33.337 { 00:38:33.337 "name": "spare", 00:38:33.337 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:33.337 "is_configured": true, 00:38:33.337 "data_offset": 0, 00:38:33.337 "data_size": 65536 00:38:33.337 }, 00:38:33.337 { 00:38:33.337 "name": null, 00:38:33.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:33.337 "is_configured": false, 00:38:33.337 "data_offset": 0, 00:38:33.337 "data_size": 65536 00:38:33.337 }, 00:38:33.337 { 00:38:33.337 "name": "BaseBdev3", 00:38:33.337 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:33.337 
"is_configured": true, 00:38:33.337 "data_offset": 0, 00:38:33.337 "data_size": 65536 00:38:33.337 }, 00:38:33.337 { 00:38:33.337 "name": "BaseBdev4", 00:38:33.337 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:33.337 "is_configured": true, 00:38:33.337 "data_offset": 0, 00:38:33.337 "data_size": 65536 00:38:33.337 } 00:38:33.337 ] 00:38:33.337 }' 00:38:33.337 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=687 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:33.596 "name": 
"raid_bdev1", 00:38:33.596 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:33.596 "strip_size_kb": 0, 00:38:33.596 "state": "online", 00:38:33.596 "raid_level": "raid1", 00:38:33.596 "superblock": false, 00:38:33.596 "num_base_bdevs": 4, 00:38:33.596 "num_base_bdevs_discovered": 3, 00:38:33.596 "num_base_bdevs_operational": 3, 00:38:33.596 "process": { 00:38:33.596 "type": "rebuild", 00:38:33.596 "target": "spare", 00:38:33.596 "progress": { 00:38:33.596 "blocks": 36864, 00:38:33.596 "percent": 56 00:38:33.596 } 00:38:33.596 }, 00:38:33.596 "base_bdevs_list": [ 00:38:33.596 { 00:38:33.596 "name": "spare", 00:38:33.596 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:33.596 "is_configured": true, 00:38:33.596 "data_offset": 0, 00:38:33.596 "data_size": 65536 00:38:33.596 }, 00:38:33.596 { 00:38:33.596 "name": null, 00:38:33.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:33.596 "is_configured": false, 00:38:33.596 "data_offset": 0, 00:38:33.596 "data_size": 65536 00:38:33.596 }, 00:38:33.596 { 00:38:33.596 "name": "BaseBdev3", 00:38:33.596 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:33.596 "is_configured": true, 00:38:33.596 "data_offset": 0, 00:38:33.596 "data_size": 65536 00:38:33.596 }, 00:38:33.596 { 00:38:33.596 "name": "BaseBdev4", 00:38:33.596 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:33.596 "is_configured": true, 00:38:33.596 "data_offset": 0, 00:38:33.596 "data_size": 65536 00:38:33.596 } 00:38:33.596 ] 00:38:33.596 }' 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:33.596 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:33.855 11:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:33.855 11:49:17 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:38:34.790 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:34.791 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:35.049 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:35.049 "name": "raid_bdev1", 00:38:35.049 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:35.049 "strip_size_kb": 0, 00:38:35.049 "state": "online", 00:38:35.049 "raid_level": "raid1", 00:38:35.049 "superblock": false, 00:38:35.049 "num_base_bdevs": 4, 00:38:35.049 "num_base_bdevs_discovered": 3, 00:38:35.049 "num_base_bdevs_operational": 3, 00:38:35.049 "process": { 00:38:35.049 "type": "rebuild", 00:38:35.049 "target": "spare", 00:38:35.049 "progress": { 00:38:35.049 "blocks": 63488, 00:38:35.049 "percent": 96 00:38:35.049 } 00:38:35.049 }, 00:38:35.049 "base_bdevs_list": [ 00:38:35.049 { 00:38:35.049 "name": "spare", 00:38:35.049 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:35.049 "is_configured": true, 00:38:35.049 "data_offset": 0, 00:38:35.049 "data_size": 65536 00:38:35.049 }, 00:38:35.049 { 00:38:35.049 "name": null, 00:38:35.049 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:38:35.049 "is_configured": false, 00:38:35.049 "data_offset": 0, 00:38:35.049 "data_size": 65536 00:38:35.049 }, 00:38:35.049 { 00:38:35.049 "name": "BaseBdev3", 00:38:35.049 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:35.049 "is_configured": true, 00:38:35.049 "data_offset": 0, 00:38:35.049 "data_size": 65536 00:38:35.049 }, 00:38:35.049 { 00:38:35.049 "name": "BaseBdev4", 00:38:35.049 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:35.049 "is_configured": true, 00:38:35.049 "data_offset": 0, 00:38:35.049 "data_size": 65536 00:38:35.049 } 00:38:35.049 ] 00:38:35.049 }' 00:38:35.049 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:35.049 [2024-06-10 11:49:18.790527] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:38:35.049 [2024-06-10 11:49:18.790571] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:38:35.049 [2024-06-10 11:49:18.790596] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:35.049 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:35.049 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:35.049 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:35.049 11:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:35.981 11:49:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:36.240 "name": "raid_bdev1", 00:38:36.240 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:36.240 "strip_size_kb": 0, 00:38:36.240 "state": "online", 00:38:36.240 "raid_level": "raid1", 00:38:36.240 "superblock": false, 00:38:36.240 "num_base_bdevs": 4, 00:38:36.240 "num_base_bdevs_discovered": 3, 00:38:36.240 "num_base_bdevs_operational": 3, 00:38:36.240 "base_bdevs_list": [ 00:38:36.240 { 00:38:36.240 "name": "spare", 00:38:36.240 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:36.240 "is_configured": true, 00:38:36.240 "data_offset": 0, 00:38:36.240 "data_size": 65536 00:38:36.240 }, 00:38:36.240 { 00:38:36.240 "name": null, 00:38:36.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:36.240 "is_configured": false, 00:38:36.240 "data_offset": 0, 00:38:36.240 "data_size": 65536 00:38:36.240 }, 00:38:36.240 { 00:38:36.240 "name": "BaseBdev3", 00:38:36.240 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:36.240 "is_configured": true, 00:38:36.240 "data_offset": 0, 00:38:36.240 "data_size": 65536 00:38:36.240 }, 00:38:36.240 { 00:38:36.240 "name": "BaseBdev4", 00:38:36.240 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:36.240 "is_configured": true, 00:38:36.240 "data_offset": 0, 00:38:36.240 "data_size": 65536 00:38:36.240 } 00:38:36.240 ] 00:38:36.240 }' 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:36.240 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:36.499 "name": "raid_bdev1", 00:38:36.499 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:36.499 "strip_size_kb": 0, 00:38:36.499 "state": "online", 00:38:36.499 "raid_level": "raid1", 00:38:36.499 "superblock": false, 00:38:36.499 "num_base_bdevs": 4, 00:38:36.499 "num_base_bdevs_discovered": 3, 00:38:36.499 "num_base_bdevs_operational": 3, 00:38:36.499 "base_bdevs_list": [ 00:38:36.499 { 00:38:36.499 "name": "spare", 00:38:36.499 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:36.499 "is_configured": true, 00:38:36.499 "data_offset": 0, 
00:38:36.499 "data_size": 65536 00:38:36.499 }, 00:38:36.499 { 00:38:36.499 "name": null, 00:38:36.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:36.499 "is_configured": false, 00:38:36.499 "data_offset": 0, 00:38:36.499 "data_size": 65536 00:38:36.499 }, 00:38:36.499 { 00:38:36.499 "name": "BaseBdev3", 00:38:36.499 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:36.499 "is_configured": true, 00:38:36.499 "data_offset": 0, 00:38:36.499 "data_size": 65536 00:38:36.499 }, 00:38:36.499 { 00:38:36.499 "name": "BaseBdev4", 00:38:36.499 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:36.499 "is_configured": true, 00:38:36.499 "data_offset": 0, 00:38:36.499 "data_size": 65536 00:38:36.499 } 00:38:36.499 ] 00:38:36.499 }' 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:36.499 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:36.757 11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:36.757 "name": "raid_bdev1", 00:38:36.757 "uuid": "d72929cc-7584-442c-a738-1e6d9939af35", 00:38:36.757 "strip_size_kb": 0, 00:38:36.757 "state": "online", 00:38:36.757 "raid_level": "raid1", 00:38:36.757 "superblock": false, 00:38:36.757 "num_base_bdevs": 4, 00:38:36.757 "num_base_bdevs_discovered": 3, 00:38:36.757 "num_base_bdevs_operational": 3, 00:38:36.757 "base_bdevs_list": [ 00:38:36.757 { 00:38:36.757 "name": "spare", 00:38:36.757 "uuid": "86c565a2-d57e-5b5f-b462-1b26edf482d3", 00:38:36.757 "is_configured": true, 00:38:36.757 "data_offset": 0, 00:38:36.757 "data_size": 65536 00:38:36.757 }, 00:38:36.757 { 00:38:36.757 "name": null, 00:38:36.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:36.757 "is_configured": false, 00:38:36.757 "data_offset": 0, 00:38:36.757 "data_size": 65536 00:38:36.757 }, 00:38:36.757 { 00:38:36.757 "name": "BaseBdev3", 00:38:36.757 "uuid": "f6551c10-ec29-5d36-8454-da725b747b49", 00:38:36.757 "is_configured": true, 00:38:36.757 "data_offset": 0, 00:38:36.757 "data_size": 65536 00:38:36.757 }, 00:38:36.757 { 00:38:36.757 "name": "BaseBdev4", 00:38:36.757 "uuid": "21dc4bee-0d25-588c-84cd-71b599f5618d", 00:38:36.757 "is_configured": true, 00:38:36.757 "data_offset": 0, 00:38:36.757 "data_size": 65536 00:38:36.757 } 00:38:36.757 ] 00:38:36.757 }' 00:38:36.757 
11:49:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:36.757 11:49:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:38:37.323 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:38:37.323 [2024-06-10 11:49:21.180625] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:37.323 [2024-06-10 11:49:21.180650] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:37.323 [2024-06-10 11:49:21.180693] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:37.323 [2024-06-10 11:49:21.180740] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:37.323 [2024-06-10 11:49:21.180748] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217c790 name raid_bdev1, state offline 00:38:37.323 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:37.323 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev1' 'spare') 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:37.581 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:38:37.839 /dev/nbd0 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:37.839 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:37.840 1+0 records in 00:38:37.840 1+0 records out 00:38:37.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252444 s, 16.2 MB/s 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:38:37.840 /dev/nbd1 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:38:37.840 
11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:37.840 1+0 records in 00:38:37.840 1+0 records out 00:38:37.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268419 s, 15.3 MB/s 00:38:37.840 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local 
nbd_list 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:38.098 11:49:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:38.365 11:49:22 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 229764 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 229764 ']' 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 229764 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 229764 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 229764' 00:38:38.365 killing process with pid 229764 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 229764 00:38:38.365 Received shutdown signal, test time was about 60.000000 seconds 00:38:38.365 00:38:38.365 Latency(us) 00:38:38.365 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:38.365 =================================================================================================================== 00:38:38.365 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:38:38.365 [2024-06-10 11:49:22.302651] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:38:38.365 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 229764 00:38:38.631 [2024-06-10 11:49:22.351251] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:38:38.631 11:49:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:38:38.631 00:38:38.631 real 0m20.244s 00:38:38.631 user 0m26.693s 00:38:38.631 sys 0m4.187s 00:38:38.631 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:38:38.631 11:49:22 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:38:38.631 ************************************ 00:38:38.631 END TEST raid_rebuild_test 00:38:38.631 ************************************ 00:38:38.890 11:49:22 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:38:38.890 11:49:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:38:38.890 11:49:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:38:38.890 11:49:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:38:38.890 ************************************ 00:38:38.890 START TEST raid_rebuild_test_sb 00:38:38.890 ************************************ 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true false true 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local 
raid_bdev_size 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=232668 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 232668 /var/tmp/spdk-raid.sock 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 232668 ']' 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:38:38.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:38:38.890 11:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:38.890 [2024-06-10 11:49:22.706311] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:38:38.890 [2024-06-10 11:49:22.706362] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232668 ] 00:38:38.890 I/O size of 3145728 is greater than zero copy threshold (65536). 00:38:38.890 Zero copy mechanism will not be used. 00:38:38.890 [2024-06-10 11:49:22.790823] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:39.149 [2024-06-10 11:49:22.873111] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:38:39.149 [2024-06-10 11:49:22.932270] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:39.149 [2024-06-10 11:49:22.932303] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:39.754 11:49:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:38:39.754 11:49:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:38:39.754 11:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:39.754 11:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:38:39.754 BaseBdev1_malloc 00:38:40.050 11:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:38:40.050 [2024-06-10 11:49:23.835463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:38:40.050 [2024-06-10 11:49:23.835505] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:40.050 [2024-06-10 11:49:23.835521] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x1ecf780 00:38:40.050 [2024-06-10 11:49:23.835529] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:40.050 [2024-06-10 11:49:23.836622] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:40.050 [2024-06-10 11:49:23.836643] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:38:40.050 BaseBdev1 00:38:40.050 11:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:40.050 11:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:38:40.315 BaseBdev2_malloc 00:38:40.315 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:38:40.315 [2024-06-10 11:49:24.184179] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:38:40.315 [2024-06-10 11:49:24.184214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:40.315 [2024-06-10 11:49:24.184227] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207aa50 00:38:40.315 [2024-06-10 11:49:24.184235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:40.315 [2024-06-10 11:49:24.185145] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:40.315 [2024-06-10 11:49:24.185166] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:38:40.315 BaseBdev2 00:38:40.315 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:40.315 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:38:40.573 BaseBdev3_malloc 00:38:40.573 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:38:40.832 [2024-06-10 11:49:24.532678] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:38:40.832 [2024-06-10 11:49:24.532714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:40.832 [2024-06-10 11:49:24.532728] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2079720 00:38:40.832 [2024-06-10 11:49:24.532740] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:40.832 [2024-06-10 11:49:24.533720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:40.832 [2024-06-10 11:49:24.533741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:38:40.832 BaseBdev3 00:38:40.832 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:40.832 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:38:40.832 BaseBdev4_malloc 00:38:40.832 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:38:41.090 [2024-06-10 11:49:24.889272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:38:41.090 [2024-06-10 11:49:24.889306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:41.090 
[2024-06-10 11:49:24.889322] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207d970 00:38:41.090 [2024-06-10 11:49:24.889330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:41.090 [2024-06-10 11:49:24.890246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:41.090 [2024-06-10 11:49:24.890268] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:38:41.090 BaseBdev4 00:38:41.090 11:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:38:41.349 spare_malloc 00:38:41.349 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:38:41.349 spare_delay 00:38:41.349 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:41.607 [2024-06-10 11:49:25.382201] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:41.607 [2024-06-10 11:49:25.382234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:41.607 [2024-06-10 11:49:25.382248] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207c060 00:38:41.607 [2024-06-10 11:49:25.382256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:41.607 [2024-06-10 11:49:25.383229] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:41.607 [2024-06-10 11:49:25.383250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:41.607 spare 00:38:41.607 
11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:38:41.607 [2024-06-10 11:49:25.546646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:41.607 [2024-06-10 11:49:25.547405] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:38:41.607 [2024-06-10 11:49:25.547442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:38:41.607 [2024-06-10 11:49:25.547469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:38:41.607 [2024-06-10 11:49:25.547598] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2080790 00:38:41.607 [2024-06-10 11:49:25.547606] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:38:41.607 [2024-06-10 11:49:25.547719] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207a5f0 00:38:41.607 [2024-06-10 11:49:25.547813] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2080790 00:38:41.607 [2024-06-10 11:49:25.547819] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2080790 00:38:41.607 [2024-06-10 11:49:25.547904] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:41.865 11:49:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:41.865 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:41.865 "name": "raid_bdev1", 00:38:41.865 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:41.865 "strip_size_kb": 0, 00:38:41.866 "state": "online", 00:38:41.866 "raid_level": "raid1", 00:38:41.866 "superblock": true, 00:38:41.866 "num_base_bdevs": 4, 00:38:41.866 "num_base_bdevs_discovered": 4, 00:38:41.866 "num_base_bdevs_operational": 4, 00:38:41.866 "base_bdevs_list": [ 00:38:41.866 { 00:38:41.866 "name": "BaseBdev1", 00:38:41.866 "uuid": "2c330546-24b8-54d9-9807-c73eaa043ee4", 00:38:41.866 "is_configured": true, 00:38:41.866 "data_offset": 2048, 00:38:41.866 "data_size": 63488 00:38:41.866 }, 00:38:41.866 { 00:38:41.866 "name": "BaseBdev2", 00:38:41.866 "uuid": "afd02b12-a4f3-5953-8005-83ddb12cd394", 00:38:41.866 "is_configured": true, 00:38:41.866 "data_offset": 2048, 00:38:41.866 "data_size": 63488 00:38:41.866 }, 00:38:41.866 { 00:38:41.866 "name": "BaseBdev3", 00:38:41.866 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 
00:38:41.866 "is_configured": true, 00:38:41.866 "data_offset": 2048, 00:38:41.866 "data_size": 63488 00:38:41.866 }, 00:38:41.866 { 00:38:41.866 "name": "BaseBdev4", 00:38:41.866 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:41.866 "is_configured": true, 00:38:41.866 "data_offset": 2048, 00:38:41.866 "data_size": 63488 00:38:41.866 } 00:38:41.866 ] 00:38:41.866 }' 00:38:41.866 11:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:41.866 11:49:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:42.432 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:42.432 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:38:42.432 [2024-06-10 11:49:26.376979] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:38:42.690 
11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:42.690 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:38:42.949 [2024-06-10 11:49:26.733744] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2081330 00:38:42.949 /dev/nbd0 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 
00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:42.949 1+0 records in 00:38:42.949 1+0 records out 00:38:42.949 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272781 s, 15.0 MB/s 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:38:42.949 11:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:38:48.215 63488+0 records in 00:38:48.215 63488+0 records out 00:38:48.215 32505856 bytes (33 MB, 31 MiB) copied, 5.06991 s, 6.4 MB/s 00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 
00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:48.215 11:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:48.215 [2024-06-10 11:49:32.051376] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:38:48.215 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:38:48.472 [2024-06-10 11:49:32.219843] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:38:48.472 11:49:32 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:48.472 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:48.472 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:48.472 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:48.472 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:48.472 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:48.472 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:48.473 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:48.473 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:48.473 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:48.473 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:48.473 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:48.732 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:48.732 "name": "raid_bdev1", 00:38:48.732 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:48.732 "strip_size_kb": 0, 00:38:48.732 "state": "online", 00:38:48.732 "raid_level": "raid1", 00:38:48.732 "superblock": true, 00:38:48.732 "num_base_bdevs": 4, 00:38:48.732 "num_base_bdevs_discovered": 3, 00:38:48.732 "num_base_bdevs_operational": 3, 00:38:48.732 "base_bdevs_list": [ 00:38:48.732 { 00:38:48.732 "name": null, 00:38:48.732 "uuid": "00000000-0000-0000-0000-000000000000", 
00:38:48.732 "is_configured": false, 00:38:48.732 "data_offset": 2048, 00:38:48.732 "data_size": 63488 00:38:48.732 }, 00:38:48.732 { 00:38:48.732 "name": "BaseBdev2", 00:38:48.732 "uuid": "afd02b12-a4f3-5953-8005-83ddb12cd394", 00:38:48.732 "is_configured": true, 00:38:48.732 "data_offset": 2048, 00:38:48.732 "data_size": 63488 00:38:48.732 }, 00:38:48.732 { 00:38:48.732 "name": "BaseBdev3", 00:38:48.732 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:48.732 "is_configured": true, 00:38:48.732 "data_offset": 2048, 00:38:48.732 "data_size": 63488 00:38:48.732 }, 00:38:48.732 { 00:38:48.732 "name": "BaseBdev4", 00:38:48.732 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:48.732 "is_configured": true, 00:38:48.732 "data_offset": 2048, 00:38:48.732 "data_size": 63488 00:38:48.732 } 00:38:48.732 ] 00:38:48.732 }' 00:38:48.732 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:48.732 11:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:48.991 11:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:49.250 [2024-06-10 11:49:33.086100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:49.250 [2024-06-10 11:49:33.089728] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eceba0 00:38:49.250 [2024-06-10 11:49:33.091347] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:49.250 11:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:38:50.186 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:50.186 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:50.186 11:49:34 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:50.186 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:50.186 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:50.186 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:50.186 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:50.445 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:50.445 "name": "raid_bdev1", 00:38:50.445 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:50.445 "strip_size_kb": 0, 00:38:50.445 "state": "online", 00:38:50.445 "raid_level": "raid1", 00:38:50.445 "superblock": true, 00:38:50.445 "num_base_bdevs": 4, 00:38:50.445 "num_base_bdevs_discovered": 4, 00:38:50.445 "num_base_bdevs_operational": 4, 00:38:50.445 "process": { 00:38:50.445 "type": "rebuild", 00:38:50.445 "target": "spare", 00:38:50.445 "progress": { 00:38:50.445 "blocks": 22528, 00:38:50.445 "percent": 35 00:38:50.445 } 00:38:50.445 }, 00:38:50.445 "base_bdevs_list": [ 00:38:50.445 { 00:38:50.445 "name": "spare", 00:38:50.445 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:50.445 "is_configured": true, 00:38:50.445 "data_offset": 2048, 00:38:50.445 "data_size": 63488 00:38:50.445 }, 00:38:50.445 { 00:38:50.445 "name": "BaseBdev2", 00:38:50.445 "uuid": "afd02b12-a4f3-5953-8005-83ddb12cd394", 00:38:50.445 "is_configured": true, 00:38:50.445 "data_offset": 2048, 00:38:50.445 "data_size": 63488 00:38:50.445 }, 00:38:50.445 { 00:38:50.445 "name": "BaseBdev3", 00:38:50.445 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:50.445 "is_configured": true, 00:38:50.445 "data_offset": 2048, 00:38:50.445 "data_size": 63488 00:38:50.445 }, 
00:38:50.445 { 00:38:50.445 "name": "BaseBdev4", 00:38:50.445 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:50.445 "is_configured": true, 00:38:50.445 "data_offset": 2048, 00:38:50.445 "data_size": 63488 00:38:50.445 } 00:38:50.445 ] 00:38:50.445 }' 00:38:50.445 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:50.445 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:50.445 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:50.445 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:50.445 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:38:50.705 [2024-06-10 11:49:34.536124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:50.705 [2024-06-10 11:49:34.602490] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:50.705 [2024-06-10 11:49:34.602525] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:50.705 [2024-06-10 11:49:34.602537] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:50.705 [2024-06-10 11:49:34.602543] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:50.705 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:50.964 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:50.964 "name": "raid_bdev1", 00:38:50.964 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:50.964 "strip_size_kb": 0, 00:38:50.964 "state": "online", 00:38:50.964 "raid_level": "raid1", 00:38:50.964 "superblock": true, 00:38:50.964 "num_base_bdevs": 4, 00:38:50.964 "num_base_bdevs_discovered": 3, 00:38:50.964 "num_base_bdevs_operational": 3, 00:38:50.964 "base_bdevs_list": [ 00:38:50.964 { 00:38:50.964 "name": null, 00:38:50.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:50.964 "is_configured": false, 00:38:50.964 "data_offset": 2048, 00:38:50.964 "data_size": 63488 00:38:50.964 }, 00:38:50.964 { 00:38:50.964 "name": "BaseBdev2", 00:38:50.964 "uuid": "afd02b12-a4f3-5953-8005-83ddb12cd394", 00:38:50.964 "is_configured": true, 00:38:50.964 "data_offset": 2048, 00:38:50.964 "data_size": 63488 00:38:50.964 }, 00:38:50.964 { 00:38:50.964 "name": "BaseBdev3", 00:38:50.964 "uuid": 
"bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:50.964 "is_configured": true, 00:38:50.964 "data_offset": 2048, 00:38:50.964 "data_size": 63488 00:38:50.964 }, 00:38:50.964 { 00:38:50.964 "name": "BaseBdev4", 00:38:50.964 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:50.964 "is_configured": true, 00:38:50.964 "data_offset": 2048, 00:38:50.964 "data_size": 63488 00:38:50.964 } 00:38:50.964 ] 00:38:50.964 }' 00:38:50.964 11:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:50.964 11:49:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:51.533 "name": "raid_bdev1", 00:38:51.533 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:51.533 "strip_size_kb": 0, 00:38:51.533 "state": "online", 00:38:51.533 "raid_level": "raid1", 00:38:51.533 "superblock": true, 00:38:51.533 "num_base_bdevs": 4, 00:38:51.533 "num_base_bdevs_discovered": 3, 00:38:51.533 "num_base_bdevs_operational": 3, 00:38:51.533 "base_bdevs_list": [ 00:38:51.533 { 
00:38:51.533 "name": null, 00:38:51.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:51.533 "is_configured": false, 00:38:51.533 "data_offset": 2048, 00:38:51.533 "data_size": 63488 00:38:51.533 }, 00:38:51.533 { 00:38:51.533 "name": "BaseBdev2", 00:38:51.533 "uuid": "afd02b12-a4f3-5953-8005-83ddb12cd394", 00:38:51.533 "is_configured": true, 00:38:51.533 "data_offset": 2048, 00:38:51.533 "data_size": 63488 00:38:51.533 }, 00:38:51.533 { 00:38:51.533 "name": "BaseBdev3", 00:38:51.533 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:51.533 "is_configured": true, 00:38:51.533 "data_offset": 2048, 00:38:51.533 "data_size": 63488 00:38:51.533 }, 00:38:51.533 { 00:38:51.533 "name": "BaseBdev4", 00:38:51.533 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:51.533 "is_configured": true, 00:38:51.533 "data_offset": 2048, 00:38:51.533 "data_size": 63488 00:38:51.533 } 00:38:51.533 ] 00:38:51.533 }' 00:38:51.533 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:51.792 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:51.792 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:51.792 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:51.792 11:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:51.792 [2024-06-10 11:49:35.720946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:51.792 [2024-06-10 11:49:35.725130] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207a5f0 00:38:51.792 [2024-06-10 11:49:35.726291] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:51.792 11:49:35 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:53.170 "name": "raid_bdev1", 00:38:53.170 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:53.170 "strip_size_kb": 0, 00:38:53.170 "state": "online", 00:38:53.170 "raid_level": "raid1", 00:38:53.170 "superblock": true, 00:38:53.170 "num_base_bdevs": 4, 00:38:53.170 "num_base_bdevs_discovered": 4, 00:38:53.170 "num_base_bdevs_operational": 4, 00:38:53.170 "process": { 00:38:53.170 "type": "rebuild", 00:38:53.170 "target": "spare", 00:38:53.170 "progress": { 00:38:53.170 "blocks": 22528, 00:38:53.170 "percent": 35 00:38:53.170 } 00:38:53.170 }, 00:38:53.170 "base_bdevs_list": [ 00:38:53.170 { 00:38:53.170 "name": "spare", 00:38:53.170 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:53.170 "is_configured": true, 00:38:53.170 "data_offset": 2048, 00:38:53.170 "data_size": 63488 00:38:53.170 }, 00:38:53.170 { 00:38:53.170 "name": "BaseBdev2", 00:38:53.170 "uuid": "afd02b12-a4f3-5953-8005-83ddb12cd394", 00:38:53.170 
"is_configured": true, 00:38:53.170 "data_offset": 2048, 00:38:53.170 "data_size": 63488 00:38:53.170 }, 00:38:53.170 { 00:38:53.170 "name": "BaseBdev3", 00:38:53.170 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:53.170 "is_configured": true, 00:38:53.170 "data_offset": 2048, 00:38:53.170 "data_size": 63488 00:38:53.170 }, 00:38:53.170 { 00:38:53.170 "name": "BaseBdev4", 00:38:53.170 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:53.170 "is_configured": true, 00:38:53.170 "data_offset": 2048, 00:38:53.170 "data_size": 63488 00:38:53.170 } 00:38:53.170 ] 00:38:53.170 }' 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:53.170 11:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:38:53.170 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:38:53.170 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:38:53.429 [2024-06-10 11:49:37.161515] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev2 00:38:53.429 [2024-06-10 11:49:37.337576] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x207a5f0 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:53.429 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:53.689 "name": "raid_bdev1", 00:38:53.689 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:53.689 "strip_size_kb": 0, 00:38:53.689 "state": "online", 00:38:53.689 "raid_level": "raid1", 00:38:53.689 "superblock": true, 00:38:53.689 "num_base_bdevs": 4, 00:38:53.689 "num_base_bdevs_discovered": 3, 00:38:53.689 "num_base_bdevs_operational": 3, 00:38:53.689 "process": { 00:38:53.689 "type": "rebuild", 00:38:53.689 "target": "spare", 00:38:53.689 "progress": { 00:38:53.689 "blocks": 32768, 00:38:53.689 "percent": 51 00:38:53.689 } 00:38:53.689 }, 00:38:53.689 "base_bdevs_list": [ 00:38:53.689 { 00:38:53.689 "name": "spare", 
00:38:53.689 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:53.689 "is_configured": true, 00:38:53.689 "data_offset": 2048, 00:38:53.689 "data_size": 63488 00:38:53.689 }, 00:38:53.689 { 00:38:53.689 "name": null, 00:38:53.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:53.689 "is_configured": false, 00:38:53.689 "data_offset": 2048, 00:38:53.689 "data_size": 63488 00:38:53.689 }, 00:38:53.689 { 00:38:53.689 "name": "BaseBdev3", 00:38:53.689 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:53.689 "is_configured": true, 00:38:53.689 "data_offset": 2048, 00:38:53.689 "data_size": 63488 00:38:53.689 }, 00:38:53.689 { 00:38:53.689 "name": "BaseBdev4", 00:38:53.689 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:53.689 "is_configured": true, 00:38:53.689 "data_offset": 2048, 00:38:53.689 "data_size": 63488 00:38:53.689 } 00:38:53.689 ] 00:38:53.689 }' 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=707 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:53.689 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:53.948 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:53.948 "name": "raid_bdev1", 00:38:53.948 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:53.948 "strip_size_kb": 0, 00:38:53.948 "state": "online", 00:38:53.948 "raid_level": "raid1", 00:38:53.948 "superblock": true, 00:38:53.948 "num_base_bdevs": 4, 00:38:53.948 "num_base_bdevs_discovered": 3, 00:38:53.948 "num_base_bdevs_operational": 3, 00:38:53.948 "process": { 00:38:53.948 "type": "rebuild", 00:38:53.948 "target": "spare", 00:38:53.948 "progress": { 00:38:53.948 "blocks": 38912, 00:38:53.948 "percent": 61 00:38:53.948 } 00:38:53.948 }, 00:38:53.948 "base_bdevs_list": [ 00:38:53.948 { 00:38:53.948 "name": "spare", 00:38:53.948 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:53.948 "is_configured": true, 00:38:53.948 "data_offset": 2048, 00:38:53.948 "data_size": 63488 00:38:53.948 }, 00:38:53.948 { 00:38:53.948 "name": null, 00:38:53.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:53.948 "is_configured": false, 00:38:53.948 "data_offset": 2048, 00:38:53.948 "data_size": 63488 00:38:53.948 }, 00:38:53.948 { 00:38:53.948 "name": "BaseBdev3", 00:38:53.948 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:53.948 "is_configured": true, 00:38:53.949 "data_offset": 2048, 00:38:53.949 "data_size": 63488 00:38:53.949 }, 00:38:53.949 { 00:38:53.949 "name": "BaseBdev4", 00:38:53.949 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:53.949 "is_configured": true, 00:38:53.949 "data_offset": 2048, 
00:38:53.949 "data_size": 63488 00:38:53.949 } 00:38:53.949 ] 00:38:53.949 }' 00:38:53.949 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:53.949 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:53.949 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:53.949 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:53.949 11:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:55.326 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:55.326 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:55.326 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:55.326 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:55.326 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:55.326 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:55.327 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:55.327 11:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:55.327 [2024-06-10 11:49:38.949358] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:38:55.327 [2024-06-10 11:49:38.949410] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:38:55.327 [2024-06-10 11:49:38.949492] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:55.327 "name": "raid_bdev1", 00:38:55.327 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:55.327 "strip_size_kb": 0, 00:38:55.327 "state": "online", 00:38:55.327 "raid_level": "raid1", 00:38:55.327 "superblock": true, 00:38:55.327 "num_base_bdevs": 4, 00:38:55.327 "num_base_bdevs_discovered": 3, 00:38:55.327 "num_base_bdevs_operational": 3, 00:38:55.327 "base_bdevs_list": [ 00:38:55.327 { 00:38:55.327 "name": "spare", 00:38:55.327 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:55.327 "is_configured": true, 00:38:55.327 "data_offset": 2048, 00:38:55.327 "data_size": 63488 00:38:55.327 }, 00:38:55.327 { 00:38:55.327 "name": null, 00:38:55.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:55.327 "is_configured": false, 00:38:55.327 "data_offset": 2048, 00:38:55.327 "data_size": 63488 00:38:55.327 }, 00:38:55.327 { 00:38:55.327 "name": "BaseBdev3", 00:38:55.327 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:55.327 "is_configured": true, 00:38:55.327 "data_offset": 2048, 00:38:55.327 "data_size": 63488 00:38:55.327 }, 00:38:55.327 { 00:38:55.327 "name": "BaseBdev4", 00:38:55.327 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:55.327 "is_configured": true, 00:38:55.327 "data_offset": 2048, 00:38:55.327 "data_size": 63488 00:38:55.327 } 00:38:55.327 ] 00:38:55.327 }' 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:38:55.327 
11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:55.327 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:55.585 "name": "raid_bdev1", 00:38:55.585 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:55.585 "strip_size_kb": 0, 00:38:55.585 "state": "online", 00:38:55.585 "raid_level": "raid1", 00:38:55.585 "superblock": true, 00:38:55.585 "num_base_bdevs": 4, 00:38:55.585 "num_base_bdevs_discovered": 3, 00:38:55.585 "num_base_bdevs_operational": 3, 00:38:55.585 "base_bdevs_list": [ 00:38:55.585 { 00:38:55.585 "name": "spare", 00:38:55.585 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:55.585 "is_configured": true, 00:38:55.585 "data_offset": 2048, 00:38:55.585 "data_size": 63488 00:38:55.585 }, 00:38:55.585 { 00:38:55.585 "name": null, 00:38:55.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:55.585 "is_configured": false, 00:38:55.585 "data_offset": 2048, 00:38:55.585 "data_size": 63488 00:38:55.585 }, 00:38:55.585 { 00:38:55.585 "name": "BaseBdev3", 00:38:55.585 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:55.585 "is_configured": true, 00:38:55.585 "data_offset": 2048, 00:38:55.585 
"data_size": 63488 00:38:55.585 }, 00:38:55.585 { 00:38:55.585 "name": "BaseBdev4", 00:38:55.585 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:55.585 "is_configured": true, 00:38:55.585 "data_offset": 2048, 00:38:55.585 "data_size": 63488 00:38:55.585 } 00:38:55.585 ] 00:38:55.585 }' 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:55.585 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:55.844 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:55.844 "name": "raid_bdev1", 00:38:55.844 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:55.844 "strip_size_kb": 0, 00:38:55.844 "state": "online", 00:38:55.844 "raid_level": "raid1", 00:38:55.844 "superblock": true, 00:38:55.844 "num_base_bdevs": 4, 00:38:55.844 "num_base_bdevs_discovered": 3, 00:38:55.844 "num_base_bdevs_operational": 3, 00:38:55.844 "base_bdevs_list": [ 00:38:55.844 { 00:38:55.844 "name": "spare", 00:38:55.844 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:55.844 "is_configured": true, 00:38:55.844 "data_offset": 2048, 00:38:55.844 "data_size": 63488 00:38:55.844 }, 00:38:55.844 { 00:38:55.844 "name": null, 00:38:55.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:55.844 "is_configured": false, 00:38:55.844 "data_offset": 2048, 00:38:55.844 "data_size": 63488 00:38:55.844 }, 00:38:55.844 { 00:38:55.844 "name": "BaseBdev3", 00:38:55.844 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:55.844 "is_configured": true, 00:38:55.844 "data_offset": 2048, 00:38:55.844 "data_size": 63488 00:38:55.844 }, 00:38:55.844 { 00:38:55.844 "name": "BaseBdev4", 00:38:55.844 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:55.844 "is_configured": true, 00:38:55.844 "data_offset": 2048, 00:38:55.844 "data_size": 63488 00:38:55.844 } 00:38:55.844 ] 00:38:55.844 }' 00:38:55.844 11:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:55.844 11:49:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:56.410 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:38:56.410 [2024-06-10 11:49:40.217186] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:56.410 [2024-06-10 11:49:40.217215] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:56.410 [2024-06-10 11:49:40.217266] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:56.410 [2024-06-10 11:49:40.217314] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:56.410 [2024-06-10 11:49:40.217324] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2080790 name raid_bdev1, state offline 00:38:56.410 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:56.410 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:56.668 11:49:40 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:38:56.668 /dev/nbd0 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:56.668 1+0 records in 00:38:56.668 1+0 records out 00:38:56.668 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224525 s, 18.2 MB/s 00:38:56.668 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:56.926 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:38:56.927 /dev/nbd1 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 
20 )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:56.927 1+0 records in 00:38:56.927 1+0 records out 00:38:56.927 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281234 s, 14.6 MB/s 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:56.927 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:38:57.185 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:38:57.185 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:57.185 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:38:57.185 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:57.185 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:38:57.185 11:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:57.185 11:49:40 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:57.185 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:38:57.443 
11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:38:57.443 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:57.702 [2024-06-10 11:49:41.614658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:57.702 [2024-06-10 11:49:41.614700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:57.702 [2024-06-10 11:49:41.614716] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2005680 00:38:57.702 [2024-06-10 11:49:41.614724] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:57.702 [2024-06-10 11:49:41.615951] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:57.702 [2024-06-10 11:49:41.615976] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:57.702 [2024-06-10 11:49:41.616036] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:38:57.702 [2024-06-10 11:49:41.616058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:57.702 [2024-06-10 11:49:41.616134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:38:57.702 [2024-06-10 11:49:41.616183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:38:57.702 spare 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:38:57.702 11:49:41 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:57.702 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:57.961 [2024-06-10 11:49:41.716478] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2004110 00:38:57.961 [2024-06-10 11:49:41.716494] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:38:57.961 [2024-06-10 11:49:41.716648] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2080760 00:38:57.961 [2024-06-10 11:49:41.716763] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2004110 00:38:57.961 [2024-06-10 11:49:41.716769] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2004110 00:38:57.961 [2024-06-10 11:49:41.716846] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:38:57.961 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:57.961 "name": "raid_bdev1", 00:38:57.961 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:57.961 "strip_size_kb": 0, 00:38:57.961 "state": "online", 00:38:57.961 "raid_level": "raid1", 00:38:57.961 "superblock": true, 00:38:57.961 "num_base_bdevs": 4, 00:38:57.961 "num_base_bdevs_discovered": 3, 00:38:57.961 "num_base_bdevs_operational": 3, 00:38:57.961 "base_bdevs_list": [ 00:38:57.961 { 00:38:57.961 "name": "spare", 00:38:57.961 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:57.961 "is_configured": true, 00:38:57.961 "data_offset": 2048, 00:38:57.961 "data_size": 63488 00:38:57.961 }, 00:38:57.961 { 00:38:57.961 "name": null, 00:38:57.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:57.961 "is_configured": false, 00:38:57.961 "data_offset": 2048, 00:38:57.961 "data_size": 63488 00:38:57.961 }, 00:38:57.961 { 00:38:57.961 "name": "BaseBdev3", 00:38:57.961 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:57.961 "is_configured": true, 00:38:57.961 "data_offset": 2048, 00:38:57.961 "data_size": 63488 00:38:57.961 }, 00:38:57.961 { 00:38:57.961 "name": "BaseBdev4", 00:38:57.961 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:57.961 "is_configured": true, 00:38:57.961 "data_offset": 2048, 00:38:57.961 "data_size": 63488 00:38:57.961 } 00:38:57.961 ] 00:38:57.961 }' 00:38:57.961 11:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:57.961 11:49:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:58.526 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:58.783 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:58.783 "name": "raid_bdev1", 00:38:58.783 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:58.783 "strip_size_kb": 0, 00:38:58.783 "state": "online", 00:38:58.783 "raid_level": "raid1", 00:38:58.783 "superblock": true, 00:38:58.783 "num_base_bdevs": 4, 00:38:58.783 "num_base_bdevs_discovered": 3, 00:38:58.783 "num_base_bdevs_operational": 3, 00:38:58.783 "base_bdevs_list": [ 00:38:58.783 { 00:38:58.783 "name": "spare", 00:38:58.783 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:38:58.783 "is_configured": true, 00:38:58.783 "data_offset": 2048, 00:38:58.783 "data_size": 63488 00:38:58.783 }, 00:38:58.783 { 00:38:58.783 "name": null, 00:38:58.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:58.783 "is_configured": false, 00:38:58.783 "data_offset": 2048, 00:38:58.783 "data_size": 63488 00:38:58.783 }, 00:38:58.783 { 00:38:58.783 "name": "BaseBdev3", 00:38:58.783 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:58.783 "is_configured": true, 00:38:58.783 "data_offset": 2048, 00:38:58.783 "data_size": 63488 00:38:58.783 }, 00:38:58.783 { 00:38:58.783 "name": "BaseBdev4", 00:38:58.783 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:58.783 "is_configured": true, 00:38:58.783 "data_offset": 2048, 00:38:58.783 "data_size": 63488 00:38:58.783 } 00:38:58.783 ] 00:38:58.784 }' 00:38:58.784 11:49:42 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:58.784 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:58.784 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:58.784 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:58.784 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:58.784 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:38:58.784 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:38:59.042 [2024-06-10 11:49:42.886011] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:59.042 11:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:59.300 11:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:59.300 "name": "raid_bdev1", 00:38:59.300 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:38:59.300 "strip_size_kb": 0, 00:38:59.300 "state": "online", 00:38:59.300 "raid_level": "raid1", 00:38:59.300 "superblock": true, 00:38:59.300 "num_base_bdevs": 4, 00:38:59.300 "num_base_bdevs_discovered": 2, 00:38:59.300 "num_base_bdevs_operational": 2, 00:38:59.300 "base_bdevs_list": [ 00:38:59.300 { 00:38:59.300 "name": null, 00:38:59.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:59.300 "is_configured": false, 00:38:59.300 "data_offset": 2048, 00:38:59.300 "data_size": 63488 00:38:59.300 }, 00:38:59.300 { 00:38:59.300 "name": null, 00:38:59.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:59.300 "is_configured": false, 00:38:59.300 "data_offset": 2048, 00:38:59.300 "data_size": 63488 00:38:59.300 }, 00:38:59.300 { 00:38:59.300 "name": "BaseBdev3", 00:38:59.300 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:38:59.300 "is_configured": true, 00:38:59.300 "data_offset": 2048, 00:38:59.300 "data_size": 63488 00:38:59.300 }, 00:38:59.300 { 00:38:59.300 "name": "BaseBdev4", 00:38:59.300 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:38:59.300 "is_configured": true, 00:38:59.300 "data_offset": 2048, 00:38:59.300 "data_size": 63488 00:38:59.300 } 00:38:59.300 ] 00:38:59.300 
}' 00:38:59.300 11:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:59.300 11:49:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:38:59.557 11:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:59.815 [2024-06-10 11:49:43.656017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:59.815 [2024-06-10 11:49:43.656147] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:38:59.815 [2024-06-10 11:49:43.656160] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:38:59.815 [2024-06-10 11:49:43.656183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:59.815 [2024-06-10 11:49:43.660257] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x206cb30 00:38:59.815 [2024-06-10 11:49:43.662018] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:59.815 11:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:00.750 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:01.009 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:01.009 "name": "raid_bdev1", 00:39:01.009 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:01.009 "strip_size_kb": 0, 00:39:01.009 "state": "online", 00:39:01.009 "raid_level": "raid1", 00:39:01.009 "superblock": true, 00:39:01.009 "num_base_bdevs": 4, 00:39:01.009 "num_base_bdevs_discovered": 3, 00:39:01.009 "num_base_bdevs_operational": 3, 00:39:01.009 "process": { 00:39:01.009 "type": "rebuild", 00:39:01.009 "target": "spare", 00:39:01.009 "progress": { 00:39:01.009 "blocks": 22528, 00:39:01.009 "percent": 35 00:39:01.009 } 00:39:01.009 }, 00:39:01.009 "base_bdevs_list": [ 00:39:01.009 { 00:39:01.009 "name": "spare", 00:39:01.009 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:39:01.009 "is_configured": true, 00:39:01.009 "data_offset": 2048, 00:39:01.009 "data_size": 63488 00:39:01.009 }, 00:39:01.009 { 00:39:01.009 "name": null, 00:39:01.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:01.009 "is_configured": false, 00:39:01.009 "data_offset": 2048, 00:39:01.009 "data_size": 63488 00:39:01.009 }, 00:39:01.009 { 00:39:01.009 "name": "BaseBdev3", 00:39:01.009 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:01.009 "is_configured": true, 00:39:01.009 "data_offset": 2048, 00:39:01.009 "data_size": 63488 00:39:01.009 }, 00:39:01.009 { 00:39:01.009 "name": "BaseBdev4", 00:39:01.009 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:01.009 "is_configured": true, 00:39:01.009 "data_offset": 2048, 00:39:01.009 "data_size": 63488 00:39:01.009 } 00:39:01.009 ] 00:39:01.009 }' 00:39:01.009 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:01.009 
11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:01.009 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:01.009 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:01.009 11:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:01.268 [2024-06-10 11:49:45.072751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:01.268 [2024-06-10 11:49:45.173030] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:01.268 [2024-06-10 11:49:45.173067] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:01.268 [2024-06-10 11:49:45.173079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:01.268 [2024-06-10 11:49:45.173085] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:01.268 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:01.269 11:49:45 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:01.269 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:01.269 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:01.269 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:01.269 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:01.528 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:01.528 "name": "raid_bdev1", 00:39:01.528 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:01.528 "strip_size_kb": 0, 00:39:01.528 "state": "online", 00:39:01.528 "raid_level": "raid1", 00:39:01.528 "superblock": true, 00:39:01.528 "num_base_bdevs": 4, 00:39:01.528 "num_base_bdevs_discovered": 2, 00:39:01.528 "num_base_bdevs_operational": 2, 00:39:01.528 "base_bdevs_list": [ 00:39:01.528 { 00:39:01.528 "name": null, 00:39:01.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:01.528 "is_configured": false, 00:39:01.528 "data_offset": 2048, 00:39:01.528 "data_size": 63488 00:39:01.528 }, 00:39:01.528 { 00:39:01.528 "name": null, 00:39:01.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:01.528 "is_configured": false, 00:39:01.528 "data_offset": 2048, 00:39:01.528 "data_size": 63488 00:39:01.528 }, 00:39:01.528 { 00:39:01.528 "name": "BaseBdev3", 00:39:01.528 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:01.528 "is_configured": true, 00:39:01.528 "data_offset": 2048, 00:39:01.528 "data_size": 63488 00:39:01.528 }, 00:39:01.528 { 00:39:01.528 "name": "BaseBdev4", 00:39:01.528 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:01.528 "is_configured": true, 00:39:01.528 "data_offset": 2048, 00:39:01.528 "data_size": 63488 00:39:01.528 } 00:39:01.528 ] 00:39:01.528 
}' 00:39:01.528 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:01.528 11:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:39:02.096 11:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:02.096 [2024-06-10 11:49:46.027364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:02.096 [2024-06-10 11:49:46.027403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:02.096 [2024-06-10 11:49:46.027419] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2004f90 00:39:02.096 [2024-06-10 11:49:46.027427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:02.096 [2024-06-10 11:49:46.027702] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:02.096 [2024-06-10 11:49:46.027714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:02.096 [2024-06-10 11:49:46.027772] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:39:02.096 [2024-06-10 11:49:46.027780] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:39:02.096 [2024-06-10 11:49:46.027787] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:39:02.096 [2024-06-10 11:49:46.027799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:02.096 [2024-06-10 11:49:46.031363] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207eb90 00:39:02.096 [2024-06-10 11:49:46.032375] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:02.096 spare 00:39:02.355 11:49:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:03.291 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:03.291 "name": "raid_bdev1", 00:39:03.291 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:03.291 "strip_size_kb": 0, 00:39:03.291 "state": "online", 00:39:03.291 "raid_level": "raid1", 00:39:03.291 "superblock": true, 00:39:03.291 "num_base_bdevs": 4, 00:39:03.291 "num_base_bdevs_discovered": 3, 00:39:03.291 "num_base_bdevs_operational": 3, 00:39:03.292 "process": { 00:39:03.292 "type": "rebuild", 00:39:03.292 "target": "spare", 00:39:03.292 "progress": { 00:39:03.292 "blocks": 22528, 00:39:03.292 
"percent": 35 00:39:03.292 } 00:39:03.292 }, 00:39:03.292 "base_bdevs_list": [ 00:39:03.292 { 00:39:03.292 "name": "spare", 00:39:03.292 "uuid": "789ab350-1002-5be6-8367-50d753b78c40", 00:39:03.292 "is_configured": true, 00:39:03.292 "data_offset": 2048, 00:39:03.292 "data_size": 63488 00:39:03.292 }, 00:39:03.292 { 00:39:03.292 "name": null, 00:39:03.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:03.292 "is_configured": false, 00:39:03.292 "data_offset": 2048, 00:39:03.292 "data_size": 63488 00:39:03.292 }, 00:39:03.292 { 00:39:03.292 "name": "BaseBdev3", 00:39:03.292 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:03.292 "is_configured": true, 00:39:03.292 "data_offset": 2048, 00:39:03.292 "data_size": 63488 00:39:03.292 }, 00:39:03.292 { 00:39:03.292 "name": "BaseBdev4", 00:39:03.292 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:03.292 "is_configured": true, 00:39:03.292 "data_offset": 2048, 00:39:03.292 "data_size": 63488 00:39:03.292 } 00:39:03.292 ] 00:39:03.292 }' 00:39:03.292 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:03.549 [2024-06-10 11:49:47.439738] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:03.549 [2024-06-10 11:49:47.442438] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:03.549 [2024-06-10 11:49:47.442472] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:03.549 [2024-06-10 11:49:47.442483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:03.549 [2024-06-10 11:49:47.442488] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:03.549 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:03.550 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:03.807 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:03.807 "name": "raid_bdev1", 00:39:03.807 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:03.807 "strip_size_kb": 0, 00:39:03.807 "state": 
"online", 00:39:03.807 "raid_level": "raid1", 00:39:03.807 "superblock": true, 00:39:03.807 "num_base_bdevs": 4, 00:39:03.807 "num_base_bdevs_discovered": 2, 00:39:03.807 "num_base_bdevs_operational": 2, 00:39:03.807 "base_bdevs_list": [ 00:39:03.807 { 00:39:03.807 "name": null, 00:39:03.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:03.807 "is_configured": false, 00:39:03.807 "data_offset": 2048, 00:39:03.807 "data_size": 63488 00:39:03.807 }, 00:39:03.807 { 00:39:03.807 "name": null, 00:39:03.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:03.807 "is_configured": false, 00:39:03.807 "data_offset": 2048, 00:39:03.807 "data_size": 63488 00:39:03.807 }, 00:39:03.807 { 00:39:03.807 "name": "BaseBdev3", 00:39:03.807 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:03.807 "is_configured": true, 00:39:03.807 "data_offset": 2048, 00:39:03.807 "data_size": 63488 00:39:03.807 }, 00:39:03.807 { 00:39:03.807 "name": "BaseBdev4", 00:39:03.807 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:03.807 "is_configured": true, 00:39:03.807 "data_offset": 2048, 00:39:03.807 "data_size": 63488 00:39:03.807 } 00:39:03.807 ] 00:39:03.807 }' 00:39:03.807 11:49:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:03.807 11:49:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:04.374 "name": "raid_bdev1", 00:39:04.374 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:04.374 "strip_size_kb": 0, 00:39:04.374 "state": "online", 00:39:04.374 "raid_level": "raid1", 00:39:04.374 "superblock": true, 00:39:04.374 "num_base_bdevs": 4, 00:39:04.374 "num_base_bdevs_discovered": 2, 00:39:04.374 "num_base_bdevs_operational": 2, 00:39:04.374 "base_bdevs_list": [ 00:39:04.374 { 00:39:04.374 "name": null, 00:39:04.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:04.374 "is_configured": false, 00:39:04.374 "data_offset": 2048, 00:39:04.374 "data_size": 63488 00:39:04.374 }, 00:39:04.374 { 00:39:04.374 "name": null, 00:39:04.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:04.374 "is_configured": false, 00:39:04.374 "data_offset": 2048, 00:39:04.374 "data_size": 63488 00:39:04.374 }, 00:39:04.374 { 00:39:04.374 "name": "BaseBdev3", 00:39:04.374 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:04.374 "is_configured": true, 00:39:04.374 "data_offset": 2048, 00:39:04.374 "data_size": 63488 00:39:04.374 }, 00:39:04.374 { 00:39:04.374 "name": "BaseBdev4", 00:39:04.374 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:04.374 "is_configured": true, 00:39:04.374 "data_offset": 2048, 00:39:04.374 "data_size": 63488 00:39:04.374 } 00:39:04.374 ] 00:39:04.374 }' 00:39:04.374 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:04.633 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:04.633 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:39:04.633 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:04.633 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:39:04.633 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:39:04.891 [2024-06-10 11:49:48.682008] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:39:04.891 [2024-06-10 11:49:48.682051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:04.891 [2024-06-10 11:49:48.682065] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eced40 00:39:04.891 [2024-06-10 11:49:48.682073] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:04.891 [2024-06-10 11:49:48.682335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:04.891 [2024-06-10 11:49:48.682350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:39:04.891 [2024-06-10 11:49:48.682401] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:39:04.891 [2024-06-10 11:49:48.682410] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:39:04.891 [2024-06-10 11:49:48.682418] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:39:04.891 BaseBdev1 00:39:04.891 11:49:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:05.874 
11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:05.874 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:06.138 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:06.138 "name": "raid_bdev1", 00:39:06.138 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:06.138 "strip_size_kb": 0, 00:39:06.138 "state": "online", 00:39:06.138 "raid_level": "raid1", 00:39:06.138 "superblock": true, 00:39:06.138 "num_base_bdevs": 4, 00:39:06.138 "num_base_bdevs_discovered": 2, 00:39:06.138 "num_base_bdevs_operational": 2, 00:39:06.138 "base_bdevs_list": [ 00:39:06.138 { 00:39:06.138 "name": null, 00:39:06.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:06.138 "is_configured": false, 00:39:06.138 "data_offset": 2048, 00:39:06.138 "data_size": 63488 00:39:06.138 }, 
00:39:06.138 { 00:39:06.138 "name": null, 00:39:06.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:06.138 "is_configured": false, 00:39:06.138 "data_offset": 2048, 00:39:06.138 "data_size": 63488 00:39:06.138 }, 00:39:06.138 { 00:39:06.138 "name": "BaseBdev3", 00:39:06.138 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:06.138 "is_configured": true, 00:39:06.138 "data_offset": 2048, 00:39:06.138 "data_size": 63488 00:39:06.138 }, 00:39:06.138 { 00:39:06.138 "name": "BaseBdev4", 00:39:06.138 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:06.138 "is_configured": true, 00:39:06.138 "data_offset": 2048, 00:39:06.138 "data_size": 63488 00:39:06.138 } 00:39:06.138 ] 00:39:06.138 }' 00:39:06.138 11:49:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:06.138 11:49:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:06.707 "name": "raid_bdev1", 00:39:06.707 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:06.707 
"strip_size_kb": 0, 00:39:06.707 "state": "online", 00:39:06.707 "raid_level": "raid1", 00:39:06.707 "superblock": true, 00:39:06.707 "num_base_bdevs": 4, 00:39:06.707 "num_base_bdevs_discovered": 2, 00:39:06.707 "num_base_bdevs_operational": 2, 00:39:06.707 "base_bdevs_list": [ 00:39:06.707 { 00:39:06.707 "name": null, 00:39:06.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:06.707 "is_configured": false, 00:39:06.707 "data_offset": 2048, 00:39:06.707 "data_size": 63488 00:39:06.707 }, 00:39:06.707 { 00:39:06.707 "name": null, 00:39:06.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:06.707 "is_configured": false, 00:39:06.707 "data_offset": 2048, 00:39:06.707 "data_size": 63488 00:39:06.707 }, 00:39:06.707 { 00:39:06.707 "name": "BaseBdev3", 00:39:06.707 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:06.707 "is_configured": true, 00:39:06.707 "data_offset": 2048, 00:39:06.707 "data_size": 63488 00:39:06.707 }, 00:39:06.707 { 00:39:06.707 "name": "BaseBdev4", 00:39:06.707 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:06.707 "is_configured": true, 00:39:06.707 "data_offset": 2048, 00:39:06.707 "data_size": 63488 00:39:06.707 } 00:39:06.707 ] 00:39:06.707 }' 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:39:06.707 11:49:50 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:39:06.707 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:06.966 [2024-06-10 11:49:50.783583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:06.966 [2024-06-10 11:49:50.783696] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:39:06.966 [2024-06-10 11:49:50.783708] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:39:06.966 request: 00:39:06.966 { 00:39:06.966 "raid_bdev": "raid_bdev1", 00:39:06.966 "base_bdev": "BaseBdev1", 00:39:06.966 "method": "bdev_raid_add_base_bdev", 00:39:06.966 "req_id": 1 00:39:06.966 } 00:39:06.966 Got JSON-RPC error response 00:39:06.966 response: 00:39:06.966 { 00:39:06.966 "code": -22, 00:39:06.966 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:39:06.966 } 00:39:06.966 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:39:06.966 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:39:06.966 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:39:06.966 11:49:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:39:06.966 11:49:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:07.902 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:08.162 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:08.162 "name": "raid_bdev1", 00:39:08.162 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:08.162 "strip_size_kb": 0, 00:39:08.162 "state": "online", 00:39:08.162 "raid_level": "raid1", 00:39:08.162 "superblock": true, 00:39:08.162 "num_base_bdevs": 4, 00:39:08.162 "num_base_bdevs_discovered": 2, 00:39:08.162 "num_base_bdevs_operational": 2, 00:39:08.162 "base_bdevs_list": [ 00:39:08.162 { 00:39:08.162 "name": null, 00:39:08.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:08.162 "is_configured": false, 00:39:08.162 "data_offset": 2048, 00:39:08.162 "data_size": 63488 00:39:08.162 }, 00:39:08.162 { 00:39:08.162 "name": null, 00:39:08.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:08.162 "is_configured": false, 00:39:08.162 "data_offset": 2048, 00:39:08.162 "data_size": 63488 00:39:08.162 }, 00:39:08.162 { 00:39:08.162 "name": "BaseBdev3", 00:39:08.162 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 00:39:08.162 "is_configured": true, 00:39:08.162 "data_offset": 2048, 00:39:08.162 "data_size": 63488 00:39:08.162 }, 00:39:08.162 { 00:39:08.162 "name": "BaseBdev4", 00:39:08.162 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:08.162 "is_configured": true, 00:39:08.162 "data_offset": 2048, 00:39:08.162 "data_size": 63488 00:39:08.162 } 00:39:08.162 ] 00:39:08.162 }' 00:39:08.162 11:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:08.162 11:49:51 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:08.731 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:08.731 "name": "raid_bdev1", 00:39:08.731 "uuid": "7c748a3e-53d2-452c-9ff2-1e220f0c07d2", 00:39:08.731 "strip_size_kb": 0, 00:39:08.731 "state": "online", 00:39:08.731 "raid_level": "raid1", 00:39:08.731 "superblock": true, 00:39:08.731 "num_base_bdevs": 4, 00:39:08.731 "num_base_bdevs_discovered": 2, 00:39:08.732 "num_base_bdevs_operational": 2, 00:39:08.732 "base_bdevs_list": [ 00:39:08.732 { 00:39:08.732 "name": null, 00:39:08.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:08.732 "is_configured": false, 00:39:08.732 "data_offset": 2048, 00:39:08.732 "data_size": 63488 00:39:08.732 }, 00:39:08.732 { 00:39:08.732 "name": null, 00:39:08.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:08.732 "is_configured": false, 00:39:08.732 "data_offset": 2048, 00:39:08.732 "data_size": 63488 00:39:08.732 }, 00:39:08.732 { 00:39:08.732 "name": "BaseBdev3", 00:39:08.732 "uuid": "bea53c42-8e6a-5af8-9747-3cb5fa5b6f38", 
00:39:08.732 "is_configured": true, 00:39:08.732 "data_offset": 2048, 00:39:08.732 "data_size": 63488 00:39:08.732 }, 00:39:08.732 { 00:39:08.732 "name": "BaseBdev4", 00:39:08.732 "uuid": "a8ef8abb-56e0-50ec-88a2-d78c32f80e05", 00:39:08.732 "is_configured": true, 00:39:08.732 "data_offset": 2048, 00:39:08.732 "data_size": 63488 00:39:08.732 } 00:39:08.732 ] 00:39:08.732 }' 00:39:08.732 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 232668 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 232668 ']' 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 232668 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 232668 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 232668' 00:39:08.991 killing process with pid 232668 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 232668 00:39:08.991 Received 
shutdown signal, test time was about 60.000000 seconds 00:39:08.991 00:39:08.991 Latency(us) 00:39:08.991 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:08.991 =================================================================================================================== 00:39:08.991 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:39:08.991 [2024-06-10 11:49:52.772200] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:39:08.991 11:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 232668 00:39:08.991 [2024-06-10 11:49:52.772276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:08.991 [2024-06-10 11:49:52.772320] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:08.991 [2024-06-10 11:49:52.772329] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2004110 name raid_bdev1, state offline 00:39:08.991 [2024-06-10 11:49:52.824298] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:39:09.250 00:39:09.250 real 0m30.387s 00:39:09.250 user 0m43.200s 00:39:09.250 sys 0m5.381s 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:39:09.250 ************************************ 00:39:09.250 END TEST raid_rebuild_test_sb 00:39:09.250 ************************************ 00:39:09.250 11:49:53 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:39:09.250 11:49:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:39:09.250 11:49:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:39:09.250 11:49:53 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:39:09.250 ************************************ 00:39:09.250 START TEST raid_rebuild_test_io 00:39:09.250 ************************************ 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false true true 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:09.250 11:49:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=237094 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 237094 /var/tmp/spdk-raid.sock 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 237094 ']' 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:39:09.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:39:09.250 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:39:09.250 [2024-06-10 11:49:53.161806] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:39:09.250 [2024-06-10 11:49:53.161858] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid237094 ] 00:39:09.250 I/O size of 3145728 is greater than zero copy threshold (65536). 00:39:09.250 Zero copy mechanism will not be used. 
00:39:09.511 [2024-06-10 11:49:53.247262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:09.511 [2024-06-10 11:49:53.330985] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:39:09.511 [2024-06-10 11:49:53.380480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:09.511 [2024-06-10 11:49:53.380506] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:10.078 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:39:10.078 11:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:39:10.078 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:10.078 11:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:39:10.338 BaseBdev1_malloc 00:39:10.338 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:39:10.338 [2024-06-10 11:49:54.246152] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:39:10.338 [2024-06-10 11:49:54.246194] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:10.338 [2024-06-10 11:49:54.246210] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14df780 00:39:10.338 [2024-06-10 11:49:54.246219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:10.338 [2024-06-10 11:49:54.247570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:10.338 [2024-06-10 11:49:54.247595] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:39:10.338 BaseBdev1 
00:39:10.338 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:10.338 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:39:10.597 BaseBdev2_malloc 00:39:10.597 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:39:10.856 [2024-06-10 11:49:54.580121] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:39:10.856 [2024-06-10 11:49:54.580158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:10.857 [2024-06-10 11:49:54.580172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x168aa50 00:39:10.857 [2024-06-10 11:49:54.580180] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:10.857 [2024-06-10 11:49:54.581365] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:10.857 [2024-06-10 11:49:54.581389] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:39:10.857 BaseBdev2 00:39:10.857 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:10.857 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:39:10.857 BaseBdev3_malloc 00:39:10.857 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:39:11.116 [2024-06-10 11:49:54.905826] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:39:11.116 [2024-06-10 11:49:54.905862] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:11.116 [2024-06-10 11:49:54.905882] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1689720 00:39:11.116 [2024-06-10 11:49:54.905896] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:11.116 [2024-06-10 11:49:54.906971] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:11.116 [2024-06-10 11:49:54.906993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:39:11.116 BaseBdev3 00:39:11.116 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:11.116 11:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:39:11.376 BaseBdev4_malloc 00:39:11.376 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:39:11.376 [2024-06-10 11:49:55.234396] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:39:11.376 [2024-06-10 11:49:55.234434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:11.376 [2024-06-10 11:49:55.234450] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x168d970 00:39:11.376 [2024-06-10 11:49:55.234458] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:11.376 [2024-06-10 11:49:55.235570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:11.376 [2024-06-10 11:49:55.235592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:39:11.376 BaseBdev4 00:39:11.376 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:39:11.635 spare_malloc 00:39:11.635 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:39:11.635 spare_delay 00:39:11.894 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:11.894 [2024-06-10 11:49:55.747419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:11.894 [2024-06-10 11:49:55.747458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:11.894 [2024-06-10 11:49:55.747472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x168c060 00:39:11.894 [2024-06-10 11:49:55.747481] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:11.894 [2024-06-10 11:49:55.748648] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:11.894 [2024-06-10 11:49:55.748672] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:11.894 spare 00:39:11.894 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:39:12.154 [2024-06-10 11:49:55.919886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:12.154 [2024-06-10 11:49:55.920839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:39:12.154 [2024-06-10 11:49:55.920886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:39:12.154 [2024-06-10 11:49:55.920914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:39:12.154 [2024-06-10 11:49:55.920967] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1690790 00:39:12.154 [2024-06-10 11:49:55.920974] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:39:12.154 [2024-06-10 11:49:55.921123] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x168a5f0 00:39:12.154 [2024-06-10 11:49:55.921226] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1690790 00:39:12.154 [2024-06-10 11:49:55.921236] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1690790 00:39:12.154 [2024-06-10 11:49:55.921317] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:12.154 11:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:12.413 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:12.413 "name": "raid_bdev1", 00:39:12.413 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:12.413 "strip_size_kb": 0, 00:39:12.413 "state": "online", 00:39:12.413 "raid_level": "raid1", 00:39:12.413 "superblock": false, 00:39:12.413 "num_base_bdevs": 4, 00:39:12.413 "num_base_bdevs_discovered": 4, 00:39:12.413 "num_base_bdevs_operational": 4, 00:39:12.413 "base_bdevs_list": [ 00:39:12.413 { 00:39:12.413 "name": "BaseBdev1", 00:39:12.413 "uuid": "3a1c2731-0ff9-5a81-8d0d-1ed80e42d7e6", 00:39:12.413 "is_configured": true, 00:39:12.413 "data_offset": 0, 00:39:12.413 "data_size": 65536 00:39:12.413 }, 00:39:12.413 { 00:39:12.413 "name": "BaseBdev2", 00:39:12.413 "uuid": "5fbd84cc-6e2d-5f11-bc13-a3d67066c15a", 00:39:12.413 "is_configured": true, 00:39:12.413 "data_offset": 0, 00:39:12.413 "data_size": 65536 00:39:12.413 }, 00:39:12.413 { 00:39:12.413 "name": "BaseBdev3", 00:39:12.413 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:12.413 "is_configured": true, 00:39:12.413 "data_offset": 0, 00:39:12.413 "data_size": 65536 00:39:12.413 }, 00:39:12.413 { 00:39:12.413 "name": "BaseBdev4", 00:39:12.413 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:12.413 "is_configured": true, 00:39:12.413 "data_offset": 0, 00:39:12.413 "data_size": 65536 00:39:12.413 } 00:39:12.413 ] 00:39:12.413 }' 00:39:12.413 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:39:12.413 11:49:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:39:12.673 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:12.673 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:39:12.932 [2024-06-10 11:49:56.742170] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:12.932 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:39:12.932 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:39:12.932 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:13.193 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:39:13.193 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:39:13.193 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:39:13.193 11:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:39:13.193 [2024-06-10 11:49:57.024590] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1616ba0 00:39:13.193 I/O size of 3145728 is greater than zero copy threshold (65536). 00:39:13.193 Zero copy mechanism will not be used. 00:39:13.193 Running I/O for 60 seconds... 
00:39:13.193 [2024-06-10 11:49:57.100359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:39:13.193 [2024-06-10 11:49:57.100519] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1616ba0 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:13.453 "name": "raid_bdev1", 00:39:13.453 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:13.453 "strip_size_kb": 0, 00:39:13.453 "state": "online", 00:39:13.453 "raid_level": "raid1", 00:39:13.453 "superblock": false, 
00:39:13.453 "num_base_bdevs": 4, 00:39:13.453 "num_base_bdevs_discovered": 3, 00:39:13.453 "num_base_bdevs_operational": 3, 00:39:13.453 "base_bdevs_list": [ 00:39:13.453 { 00:39:13.453 "name": null, 00:39:13.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:13.453 "is_configured": false, 00:39:13.453 "data_offset": 0, 00:39:13.453 "data_size": 65536 00:39:13.453 }, 00:39:13.453 { 00:39:13.453 "name": "BaseBdev2", 00:39:13.453 "uuid": "5fbd84cc-6e2d-5f11-bc13-a3d67066c15a", 00:39:13.453 "is_configured": true, 00:39:13.453 "data_offset": 0, 00:39:13.453 "data_size": 65536 00:39:13.453 }, 00:39:13.453 { 00:39:13.453 "name": "BaseBdev3", 00:39:13.453 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:13.453 "is_configured": true, 00:39:13.453 "data_offset": 0, 00:39:13.453 "data_size": 65536 00:39:13.453 }, 00:39:13.453 { 00:39:13.453 "name": "BaseBdev4", 00:39:13.453 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:13.453 "is_configured": true, 00:39:13.453 "data_offset": 0, 00:39:13.453 "data_size": 65536 00:39:13.453 } 00:39:13.453 ] 00:39:13.453 }' 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:13.453 11:49:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:39:14.022 11:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:14.281 [2024-06-10 11:49:57.982858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:14.281 [2024-06-10 11:49:58.029528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x168bc90 00:39:14.281 11:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:39:14.281 [2024-06-10 11:49:58.031242] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:14.281 [2024-06-10 
11:49:58.149553] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:39:14.281 [2024-06-10 11:49:58.150728] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:39:14.541 [2024-06-10 11:49:58.368656] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:14.541 [2024-06-10 11:49:58.368880] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:14.801 [2024-06-10 11:49:58.712212] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:14.801 [2024-06-10 11:49:58.712592] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:15.061 [2024-06-10 11:49:58.922750] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:15.320 
11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:15.320 "name": "raid_bdev1", 00:39:15.320 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:15.320 "strip_size_kb": 0, 00:39:15.320 "state": "online", 00:39:15.320 "raid_level": "raid1", 00:39:15.320 "superblock": false, 00:39:15.320 "num_base_bdevs": 4, 00:39:15.320 "num_base_bdevs_discovered": 4, 00:39:15.320 "num_base_bdevs_operational": 4, 00:39:15.320 "process": { 00:39:15.320 "type": "rebuild", 00:39:15.320 "target": "spare", 00:39:15.320 "progress": { 00:39:15.320 "blocks": 12288, 00:39:15.320 "percent": 18 00:39:15.320 } 00:39:15.320 }, 00:39:15.320 "base_bdevs_list": [ 00:39:15.320 { 00:39:15.320 "name": "spare", 00:39:15.320 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:15.320 "is_configured": true, 00:39:15.320 "data_offset": 0, 00:39:15.320 "data_size": 65536 00:39:15.320 }, 00:39:15.320 { 00:39:15.320 "name": "BaseBdev2", 00:39:15.320 "uuid": "5fbd84cc-6e2d-5f11-bc13-a3d67066c15a", 00:39:15.320 "is_configured": true, 00:39:15.320 "data_offset": 0, 00:39:15.320 "data_size": 65536 00:39:15.320 }, 00:39:15.320 { 00:39:15.320 "name": "BaseBdev3", 00:39:15.320 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:15.320 "is_configured": true, 00:39:15.320 "data_offset": 0, 00:39:15.320 "data_size": 65536 00:39:15.320 }, 00:39:15.320 { 00:39:15.320 "name": "BaseBdev4", 00:39:15.320 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:15.320 "is_configured": true, 00:39:15.320 "data_offset": 0, 00:39:15.320 "data_size": 65536 00:39:15.320 } 00:39:15.320 ] 00:39:15.320 }' 00:39:15.320 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:15.320 [2024-06-10 11:49:59.241206] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:39:15.583 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d 
]] 00:39:15.583 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:15.583 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:15.583 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:39:15.583 [2024-06-10 11:49:59.464797] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:15.583 [2024-06-10 11:49:59.476012] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:39:15.842 [2024-06-10 11:49:59.576251] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:15.842 [2024-06-10 11:49:59.583516] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:15.842 [2024-06-10 11:49:59.583540] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:15.842 [2024-06-10 11:49:59.583548] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:15.842 [2024-06-10 11:49:59.611894] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1616ba0 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:15.842 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:16.101 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:16.101 "name": "raid_bdev1", 00:39:16.101 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:16.101 "strip_size_kb": 0, 00:39:16.101 "state": "online", 00:39:16.101 "raid_level": "raid1", 00:39:16.101 "superblock": false, 00:39:16.101 "num_base_bdevs": 4, 00:39:16.101 "num_base_bdevs_discovered": 3, 00:39:16.101 "num_base_bdevs_operational": 3, 00:39:16.101 "base_bdevs_list": [ 00:39:16.101 { 00:39:16.101 "name": null, 00:39:16.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:16.101 "is_configured": false, 00:39:16.101 "data_offset": 0, 00:39:16.101 "data_size": 65536 00:39:16.101 }, 00:39:16.101 { 00:39:16.101 "name": "BaseBdev2", 00:39:16.101 "uuid": "5fbd84cc-6e2d-5f11-bc13-a3d67066c15a", 00:39:16.101 "is_configured": true, 00:39:16.101 "data_offset": 0, 00:39:16.101 "data_size": 65536 00:39:16.101 }, 00:39:16.101 { 00:39:16.101 "name": "BaseBdev3", 00:39:16.101 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:16.101 "is_configured": true, 00:39:16.101 "data_offset": 0, 00:39:16.101 "data_size": 65536 00:39:16.101 }, 00:39:16.101 { 00:39:16.101 "name": 
"BaseBdev4", 00:39:16.101 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:16.101 "is_configured": true, 00:39:16.101 "data_offset": 0, 00:39:16.101 "data_size": 65536 00:39:16.101 } 00:39:16.101 ] 00:39:16.101 }' 00:39:16.101 11:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:16.101 11:49:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:16.670 "name": "raid_bdev1", 00:39:16.670 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:16.670 "strip_size_kb": 0, 00:39:16.670 "state": "online", 00:39:16.670 "raid_level": "raid1", 00:39:16.670 "superblock": false, 00:39:16.670 "num_base_bdevs": 4, 00:39:16.670 "num_base_bdevs_discovered": 3, 00:39:16.670 "num_base_bdevs_operational": 3, 00:39:16.670 "base_bdevs_list": [ 00:39:16.670 { 00:39:16.670 "name": null, 00:39:16.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:16.670 "is_configured": false, 00:39:16.670 "data_offset": 0, 00:39:16.670 "data_size": 65536 
00:39:16.670 }, 00:39:16.670 { 00:39:16.670 "name": "BaseBdev2", 00:39:16.670 "uuid": "5fbd84cc-6e2d-5f11-bc13-a3d67066c15a", 00:39:16.670 "is_configured": true, 00:39:16.670 "data_offset": 0, 00:39:16.670 "data_size": 65536 00:39:16.670 }, 00:39:16.670 { 00:39:16.670 "name": "BaseBdev3", 00:39:16.670 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:16.670 "is_configured": true, 00:39:16.670 "data_offset": 0, 00:39:16.670 "data_size": 65536 00:39:16.670 }, 00:39:16.670 { 00:39:16.670 "name": "BaseBdev4", 00:39:16.670 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:16.670 "is_configured": true, 00:39:16.670 "data_offset": 0, 00:39:16.670 "data_size": 65536 00:39:16.670 } 00:39:16.670 ] 00:39:16.670 }' 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:16.670 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:16.929 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:16.929 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:16.929 [2024-06-10 11:50:00.774899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:16.929 [2024-06-10 11:50:00.815234] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1684760 00:39:16.929 [2024-06-10 11:50:00.816343] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:16.929 11:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:39:17.188 [2024-06-10 11:50:00.931091] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 
offset_end: 6144 00:39:17.188 [2024-06-10 11:50:00.931598] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:39:17.188 [2024-06-10 11:50:01.053477] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:17.188 [2024-06-10 11:50:01.053758] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:17.446 [2024-06-10 11:50:01.277961] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:17.446 [2024-06-10 11:50:01.278251] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:17.706 [2024-06-10 11:50:01.501561] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:17.965 11:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:18.225 [2024-06-10 11:50:01.973895] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 
offset_begin: 12288 offset_end: 18432 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:18.225 "name": "raid_bdev1", 00:39:18.225 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:18.225 "strip_size_kb": 0, 00:39:18.225 "state": "online", 00:39:18.225 "raid_level": "raid1", 00:39:18.225 "superblock": false, 00:39:18.225 "num_base_bdevs": 4, 00:39:18.225 "num_base_bdevs_discovered": 4, 00:39:18.225 "num_base_bdevs_operational": 4, 00:39:18.225 "process": { 00:39:18.225 "type": "rebuild", 00:39:18.225 "target": "spare", 00:39:18.225 "progress": { 00:39:18.225 "blocks": 16384, 00:39:18.225 "percent": 25 00:39:18.225 } 00:39:18.225 }, 00:39:18.225 "base_bdevs_list": [ 00:39:18.225 { 00:39:18.225 "name": "spare", 00:39:18.225 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:18.225 "is_configured": true, 00:39:18.225 "data_offset": 0, 00:39:18.225 "data_size": 65536 00:39:18.225 }, 00:39:18.225 { 00:39:18.225 "name": "BaseBdev2", 00:39:18.225 "uuid": "5fbd84cc-6e2d-5f11-bc13-a3d67066c15a", 00:39:18.225 "is_configured": true, 00:39:18.225 "data_offset": 0, 00:39:18.225 "data_size": 65536 00:39:18.225 }, 00:39:18.225 { 00:39:18.225 "name": "BaseBdev3", 00:39:18.225 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:18.225 "is_configured": true, 00:39:18.225 "data_offset": 0, 00:39:18.225 "data_size": 65536 00:39:18.225 }, 00:39:18.225 { 00:39:18.225 "name": "BaseBdev4", 00:39:18.225 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:18.225 "is_configured": true, 00:39:18.225 "data_offset": 0, 00:39:18.225 "data_size": 65536 00:39:18.225 } 00:39:18.225 ] 00:39:18.225 }' 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:39:18.225 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:39:18.483 [2024-06-10 11:50:02.227197] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:39:18.483 [2024-06-10 11:50:02.294314] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:39:18.483 [2024-06-10 11:50:02.401286] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1616ba0 00:39:18.483 [2024-06-10 11:50:02.401306] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1684760 00:39:18.483 [2024-06-10 11:50:02.402684] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:18.483 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:18.742 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:18.742 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:18.742 [2024-06-10 11:50:02.540935] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:39:18.742 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:18.742 "name": "raid_bdev1", 00:39:18.742 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:18.742 "strip_size_kb": 0, 00:39:18.742 "state": "online", 00:39:18.742 "raid_level": "raid1", 00:39:18.742 "superblock": false, 00:39:18.742 "num_base_bdevs": 4, 00:39:18.742 "num_base_bdevs_discovered": 3, 00:39:18.742 "num_base_bdevs_operational": 3, 00:39:18.742 "process": { 00:39:18.742 "type": "rebuild", 00:39:18.742 "target": "spare", 00:39:18.742 "progress": { 00:39:18.742 "blocks": 22528, 00:39:18.742 "percent": 34 00:39:18.742 } 00:39:18.742 }, 00:39:18.742 "base_bdevs_list": [ 00:39:18.742 { 00:39:18.742 "name": "spare", 00:39:18.742 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:18.742 "is_configured": true, 00:39:18.742 "data_offset": 0, 00:39:18.742 "data_size": 65536 00:39:18.742 }, 00:39:18.742 { 00:39:18.742 "name": null, 00:39:18.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:18.742 "is_configured": false, 00:39:18.742 "data_offset": 0, 00:39:18.742 "data_size": 65536 00:39:18.742 }, 00:39:18.742 { 00:39:18.742 "name": "BaseBdev3", 00:39:18.742 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:18.742 "is_configured": 
true, 00:39:18.742 "data_offset": 0, 00:39:18.742 "data_size": 65536 00:39:18.742 }, 00:39:18.742 { 00:39:18.742 "name": "BaseBdev4", 00:39:18.742 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:18.742 "is_configured": true, 00:39:18.742 "data_offset": 0, 00:39:18.742 "data_size": 65536 00:39:18.742 } 00:39:18.742 ] 00:39:18.742 }' 00:39:18.742 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:18.742 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:18.742 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=732 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:39:19.001 "name": "raid_bdev1", 00:39:19.001 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:19.001 "strip_size_kb": 0, 00:39:19.001 "state": "online", 00:39:19.001 "raid_level": "raid1", 00:39:19.001 "superblock": false, 00:39:19.001 "num_base_bdevs": 4, 00:39:19.001 "num_base_bdevs_discovered": 3, 00:39:19.001 "num_base_bdevs_operational": 3, 00:39:19.001 "process": { 00:39:19.001 "type": "rebuild", 00:39:19.001 "target": "spare", 00:39:19.001 "progress": { 00:39:19.001 "blocks": 24576, 00:39:19.001 "percent": 37 00:39:19.001 } 00:39:19.001 }, 00:39:19.001 "base_bdevs_list": [ 00:39:19.001 { 00:39:19.001 "name": "spare", 00:39:19.001 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:19.001 "is_configured": true, 00:39:19.001 "data_offset": 0, 00:39:19.001 "data_size": 65536 00:39:19.001 }, 00:39:19.001 { 00:39:19.001 "name": null, 00:39:19.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:19.001 "is_configured": false, 00:39:19.001 "data_offset": 0, 00:39:19.001 "data_size": 65536 00:39:19.001 }, 00:39:19.001 { 00:39:19.001 "name": "BaseBdev3", 00:39:19.001 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:19.001 "is_configured": true, 00:39:19.001 "data_offset": 0, 00:39:19.001 "data_size": 65536 00:39:19.001 }, 00:39:19.001 { 00:39:19.001 "name": "BaseBdev4", 00:39:19.001 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:19.001 "is_configured": true, 00:39:19.001 "data_offset": 0, 00:39:19.001 "data_size": 65536 00:39:19.001 } 00:39:19.001 ] 00:39:19.001 }' 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:19.001 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:19.260 11:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:19.260 11:50:02 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:39:19.260 [2024-06-10 11:50:02.993945] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:39:19.519 [2024-06-10 11:50:03.340804] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:39:20.086 [2024-06-10 11:50:03.746318] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:20.086 11:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:20.345 11:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:20.345 "name": "raid_bdev1", 00:39:20.345 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:20.345 "strip_size_kb": 0, 00:39:20.345 "state": "online", 00:39:20.345 "raid_level": "raid1", 00:39:20.345 "superblock": false, 00:39:20.345 "num_base_bdevs": 4, 00:39:20.345 "num_base_bdevs_discovered": 3, 00:39:20.345 
"num_base_bdevs_operational": 3, 00:39:20.345 "process": { 00:39:20.345 "type": "rebuild", 00:39:20.345 "target": "spare", 00:39:20.345 "progress": { 00:39:20.345 "blocks": 45056, 00:39:20.345 "percent": 68 00:39:20.345 } 00:39:20.345 }, 00:39:20.345 "base_bdevs_list": [ 00:39:20.345 { 00:39:20.345 "name": "spare", 00:39:20.345 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:20.345 "is_configured": true, 00:39:20.345 "data_offset": 0, 00:39:20.345 "data_size": 65536 00:39:20.345 }, 00:39:20.345 { 00:39:20.345 "name": null, 00:39:20.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:20.345 "is_configured": false, 00:39:20.345 "data_offset": 0, 00:39:20.345 "data_size": 65536 00:39:20.345 }, 00:39:20.345 { 00:39:20.345 "name": "BaseBdev3", 00:39:20.345 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:20.345 "is_configured": true, 00:39:20.345 "data_offset": 0, 00:39:20.345 "data_size": 65536 00:39:20.345 }, 00:39:20.345 { 00:39:20.345 "name": "BaseBdev4", 00:39:20.345 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:20.345 "is_configured": true, 00:39:20.345 "data_offset": 0, 00:39:20.345 "data_size": 65536 00:39:20.345 } 00:39:20.345 ] 00:39:20.345 }' 00:39:20.345 11:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:20.345 11:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:20.345 11:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:20.345 11:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:20.345 11:50:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:39:20.604 [2024-06-10 11:50:04.510003] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:39:21.173 [2024-06-10 11:50:04.823394] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: 
split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:39:21.173 [2024-06-10 11:50:04.925525] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:21.452 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:21.452 [2024-06-10 11:50:05.362321] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:39:21.712 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:21.712 "name": "raid_bdev1", 00:39:21.712 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:21.712 "strip_size_kb": 0, 00:39:21.712 "state": "online", 00:39:21.712 "raid_level": "raid1", 00:39:21.712 "superblock": false, 00:39:21.712 "num_base_bdevs": 4, 00:39:21.712 "num_base_bdevs_discovered": 3, 00:39:21.712 "num_base_bdevs_operational": 3, 00:39:21.712 "process": { 00:39:21.712 "type": "rebuild", 00:39:21.712 "target": "spare", 00:39:21.712 "progress": { 00:39:21.712 "blocks": 65536, 00:39:21.712 "percent": 100 
00:39:21.712 } 00:39:21.712 }, 00:39:21.712 "base_bdevs_list": [ 00:39:21.712 { 00:39:21.712 "name": "spare", 00:39:21.712 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:21.712 "is_configured": true, 00:39:21.712 "data_offset": 0, 00:39:21.712 "data_size": 65536 00:39:21.712 }, 00:39:21.712 { 00:39:21.712 "name": null, 00:39:21.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:21.712 "is_configured": false, 00:39:21.712 "data_offset": 0, 00:39:21.712 "data_size": 65536 00:39:21.712 }, 00:39:21.712 { 00:39:21.712 "name": "BaseBdev3", 00:39:21.712 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:21.712 "is_configured": true, 00:39:21.712 "data_offset": 0, 00:39:21.712 "data_size": 65536 00:39:21.712 }, 00:39:21.712 { 00:39:21.712 "name": "BaseBdev4", 00:39:21.712 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:21.712 "is_configured": true, 00:39:21.712 "data_offset": 0, 00:39:21.712 "data_size": 65536 00:39:21.712 } 00:39:21.712 ] 00:39:21.712 }' 00:39:21.712 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:21.712 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:21.712 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:21.713 [2024-06-10 11:50:05.468040] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:39:21.713 [2024-06-10 11:50:05.470245] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:21.713 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:21.713 11:50:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process 
raid_bdev1 rebuild spare 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:22.651 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:22.913 "name": "raid_bdev1", 00:39:22.913 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:22.913 "strip_size_kb": 0, 00:39:22.913 "state": "online", 00:39:22.913 "raid_level": "raid1", 00:39:22.913 "superblock": false, 00:39:22.913 "num_base_bdevs": 4, 00:39:22.913 "num_base_bdevs_discovered": 3, 00:39:22.913 "num_base_bdevs_operational": 3, 00:39:22.913 "base_bdevs_list": [ 00:39:22.913 { 00:39:22.913 "name": "spare", 00:39:22.913 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:22.913 "is_configured": true, 00:39:22.913 "data_offset": 0, 00:39:22.913 "data_size": 65536 00:39:22.913 }, 00:39:22.913 { 00:39:22.913 "name": null, 00:39:22.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:22.913 "is_configured": false, 00:39:22.913 "data_offset": 0, 00:39:22.913 "data_size": 65536 00:39:22.913 }, 00:39:22.913 { 00:39:22.913 "name": "BaseBdev3", 00:39:22.913 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:22.913 "is_configured": true, 00:39:22.913 "data_offset": 0, 00:39:22.913 "data_size": 65536 00:39:22.913 }, 00:39:22.913 { 00:39:22.913 "name": "BaseBdev4", 00:39:22.913 
"uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:22.913 "is_configured": true, 00:39:22.913 "data_offset": 0, 00:39:22.913 "data_size": 65536 00:39:22.913 } 00:39:22.913 ] 00:39:22.913 }' 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:22.913 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:23.173 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:23.173 "name": "raid_bdev1", 00:39:23.173 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:23.173 "strip_size_kb": 0, 00:39:23.173 "state": "online", 00:39:23.173 "raid_level": "raid1", 00:39:23.173 "superblock": false, 00:39:23.173 "num_base_bdevs": 4, 00:39:23.173 
"num_base_bdevs_discovered": 3, 00:39:23.173 "num_base_bdevs_operational": 3, 00:39:23.173 "base_bdevs_list": [ 00:39:23.173 { 00:39:23.173 "name": "spare", 00:39:23.173 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:23.173 "is_configured": true, 00:39:23.173 "data_offset": 0, 00:39:23.173 "data_size": 65536 00:39:23.173 }, 00:39:23.173 { 00:39:23.173 "name": null, 00:39:23.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:23.173 "is_configured": false, 00:39:23.173 "data_offset": 0, 00:39:23.173 "data_size": 65536 00:39:23.173 }, 00:39:23.173 { 00:39:23.173 "name": "BaseBdev3", 00:39:23.173 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:23.173 "is_configured": true, 00:39:23.173 "data_offset": 0, 00:39:23.173 "data_size": 65536 00:39:23.173 }, 00:39:23.173 { 00:39:23.173 "name": "BaseBdev4", 00:39:23.173 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:23.173 "is_configured": true, 00:39:23.173 "data_offset": 0, 00:39:23.173 "data_size": 65536 00:39:23.173 } 00:39:23.173 ] 00:39:23.173 }' 00:39:23.173 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:23.173 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:23.173 11:50:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:23.173 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:23.433 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:23.433 "name": "raid_bdev1", 00:39:23.433 "uuid": "a3508388-ea02-410c-9fcd-616cfd43851b", 00:39:23.433 "strip_size_kb": 0, 00:39:23.433 "state": "online", 00:39:23.433 "raid_level": "raid1", 00:39:23.433 "superblock": false, 00:39:23.433 "num_base_bdevs": 4, 00:39:23.433 "num_base_bdevs_discovered": 3, 00:39:23.433 "num_base_bdevs_operational": 3, 00:39:23.433 "base_bdevs_list": [ 00:39:23.433 { 00:39:23.433 "name": "spare", 00:39:23.433 "uuid": "ce124d85-5880-5e2a-84d5-9cadb0a8e573", 00:39:23.433 "is_configured": true, 00:39:23.433 "data_offset": 0, 00:39:23.433 "data_size": 65536 00:39:23.433 }, 00:39:23.433 { 00:39:23.433 "name": null, 00:39:23.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:23.433 "is_configured": false, 00:39:23.433 "data_offset": 0, 00:39:23.433 "data_size": 65536 00:39:23.433 }, 00:39:23.433 { 00:39:23.433 "name": "BaseBdev3", 00:39:23.433 "uuid": "4abee77c-ab47-52a3-9875-c7ae6e907b5f", 00:39:23.433 "is_configured": true, 00:39:23.433 
"data_offset": 0, 00:39:23.433 "data_size": 65536 00:39:23.433 }, 00:39:23.433 { 00:39:23.433 "name": "BaseBdev4", 00:39:23.433 "uuid": "edd8a304-cfde-5f52-b307-ab52fe1ff250", 00:39:23.433 "is_configured": true, 00:39:23.433 "data_offset": 0, 00:39:23.433 "data_size": 65536 00:39:23.433 } 00:39:23.433 ] 00:39:23.433 }' 00:39:23.433 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:23.433 11:50:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:39:24.003 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:39:24.003 [2024-06-10 11:50:07.815766] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:24.003 [2024-06-10 11:50:07.815796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:24.003 00:39:24.003 Latency(us) 00:39:24.003 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:24.003 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:39:24.003 raid_bdev1 : 10.79 109.85 329.56 0.00 0.00 13110.40 251.10 113975.65 00:39:24.003 =================================================================================================================== 00:39:24.003 Total : 109.85 329.56 0.00 0.00 13110.40 251.10 113975.65 00:39:24.003 [2024-06-10 11:50:07.842512] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:24.003 [2024-06-10 11:50:07.842532] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:24.003 [2024-06-10 11:50:07.842591] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:24.003 [2024-06-10 11:50:07.842598] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1690790 name raid_bdev1, state 
offline 00:39:24.003 0 00:39:24.003 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:24.003 11:50:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:24.263 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:39:24.522 /dev/nbd0 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:24.522 1+0 records in 00:39:24.522 1+0 records out 00:39:24.522 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251627 s, 16.3 MB/s 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 
-- # (( i < 1 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:24.522 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:39:24.522 /dev/nbd1 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:24.782 1+0 records in 00:39:24.782 1+0 records out 00:39:24.782 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000177065 s, 23.1 MB/s 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:24.782 11:50:08 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:24.782 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:39:25.041 11:50:08 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:39:25.041 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:39:25.042 /dev/nbd1 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # 
break 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:25.042 1+0 records in 00:39:25.042 1+0 records out 00:39:25.042 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216113 s, 19.0 MB/s 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:25.042 11:50:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:25.301 11:50:09 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:25.301 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 237094 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 237094 ']' 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 237094 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 237094 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 237094' 00:39:25.561 killing process with pid 
237094 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 237094 00:39:25.561 Received shutdown signal, test time was about 12.414529 seconds 00:39:25.561 00:39:25.561 Latency(us) 00:39:25.561 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:25.561 =================================================================================================================== 00:39:25.561 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:39:25.561 [2024-06-10 11:50:09.471243] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:39:25.561 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 237094 00:39:25.820 [2024-06-10 11:50:09.513934] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:39:25.820 11:50:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:39:25.820 00:39:25.820 real 0m16.614s 00:39:25.820 user 0m24.436s 00:39:25.820 sys 0m2.862s 00:39:25.820 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:39:25.820 11:50:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:39:25.820 ************************************ 00:39:25.820 END TEST raid_rebuild_test_io 00:39:25.820 ************************************ 00:39:25.820 11:50:09 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:39:25.820 11:50:09 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:39:25.820 11:50:09 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:39:25.820 11:50:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:39:26.080 ************************************ 00:39:26.080 START TEST raid_rebuild_test_sb_io 00:39:26.080 ************************************ 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true 
true true 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 
00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=239553 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 239553 /var/tmp/spdk-raid.sock 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 239553 ']' 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:39:26.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:26.080 11:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:39:26.080 [2024-06-10 11:50:09.864640] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:39:26.080 [2024-06-10 11:50:09.864692] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid239553 ] 00:39:26.080 I/O size of 3145728 is greater than zero copy threshold (65536). 00:39:26.080 Zero copy mechanism will not be used. 
00:39:26.080 [2024-06-10 11:50:09.950285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:26.342 [2024-06-10 11:50:10.047483] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:39:26.342 [2024-06-10 11:50:10.106088] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:26.342 [2024-06-10 11:50:10.106119] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:27.008 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:39:27.008 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:39:27.008 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:27.008 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:39:27.008 BaseBdev1_malloc 00:39:27.008 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:39:27.267 [2024-06-10 11:50:10.963703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:39:27.267 [2024-06-10 11:50:10.963741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:27.267 [2024-06-10 11:50:10.963756] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xca3780 00:39:27.267 [2024-06-10 11:50:10.963765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:27.267 [2024-06-10 11:50:10.964977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:27.267 [2024-06-10 11:50:10.965002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:39:27.267 
BaseBdev1 00:39:27.267 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:27.267 11:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:39:27.267 BaseBdev2_malloc 00:39:27.267 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:39:27.526 [2024-06-10 11:50:11.308544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:39:27.526 [2024-06-10 11:50:11.308581] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:27.526 [2024-06-10 11:50:11.308594] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe4ea50 00:39:27.526 [2024-06-10 11:50:11.308602] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:27.526 [2024-06-10 11:50:11.309712] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:27.526 [2024-06-10 11:50:11.309735] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:39:27.526 BaseBdev2 00:39:27.526 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:27.526 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:39:27.786 BaseBdev3_malloc 00:39:27.786 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:39:27.786 [2024-06-10 
11:50:11.646237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:39:27.786 [2024-06-10 11:50:11.646275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:27.786 [2024-06-10 11:50:11.646289] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe4d720 00:39:27.786 [2024-06-10 11:50:11.646298] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:27.786 [2024-06-10 11:50:11.647430] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:27.786 [2024-06-10 11:50:11.647453] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:39:27.786 BaseBdev3 00:39:27.786 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:27.786 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:39:28.045 BaseBdev4_malloc 00:39:28.045 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:39:28.045 [2024-06-10 11:50:11.971900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:39:28.045 [2024-06-10 11:50:11.971948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:28.045 [2024-06-10 11:50:11.971965] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe51970 00:39:28.045 [2024-06-10 11:50:11.971972] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:28.045 [2024-06-10 11:50:11.973128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:28.045 [2024-06-10 11:50:11.973150] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:39:28.045 BaseBdev4 00:39:28.045 11:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:39:28.306 spare_malloc 00:39:28.306 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:39:28.567 spare_delay 00:39:28.567 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:28.567 [2024-06-10 11:50:12.480950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:28.567 [2024-06-10 11:50:12.480986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:28.567 [2024-06-10 11:50:12.481001] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe50060 00:39:28.567 [2024-06-10 11:50:12.481009] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:28.567 [2024-06-10 11:50:12.482212] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:28.567 [2024-06-10 11:50:12.482236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:28.567 spare 00:39:28.567 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:39:28.826 [2024-06-10 11:50:12.653423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:28.826 [2024-06-10 
11:50:12.654340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:39:28.826 [2024-06-10 11:50:12.654378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:39:28.826 [2024-06-10 11:50:12.654406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:39:28.826 [2024-06-10 11:50:12.654540] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe54790 00:39:28.826 [2024-06-10 11:50:12.654547] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:39:28.826 [2024-06-10 11:50:12.654684] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe4e5f0 00:39:28.826 [2024-06-10 11:50:12.654785] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe54790 00:39:28.826 [2024-06-10 11:50:12.654792] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe54790 00:39:28.826 [2024-06-10 11:50:12.654857] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:28.826 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:29.086 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:29.086 "name": "raid_bdev1", 00:39:29.086 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:29.086 "strip_size_kb": 0, 00:39:29.086 "state": "online", 00:39:29.086 "raid_level": "raid1", 00:39:29.086 "superblock": true, 00:39:29.086 "num_base_bdevs": 4, 00:39:29.086 "num_base_bdevs_discovered": 4, 00:39:29.086 "num_base_bdevs_operational": 4, 00:39:29.086 "base_bdevs_list": [ 00:39:29.086 { 00:39:29.086 "name": "BaseBdev1", 00:39:29.086 "uuid": "c83c5f5b-83f6-5977-b964-6527cdcb954d", 00:39:29.086 "is_configured": true, 00:39:29.086 "data_offset": 2048, 00:39:29.086 "data_size": 63488 00:39:29.086 }, 00:39:29.086 { 00:39:29.086 "name": "BaseBdev2", 00:39:29.086 "uuid": "6cc38149-7309-58e7-8b14-75987a5371eb", 00:39:29.086 "is_configured": true, 00:39:29.086 "data_offset": 2048, 00:39:29.086 "data_size": 63488 00:39:29.086 }, 00:39:29.086 { 00:39:29.086 "name": "BaseBdev3", 00:39:29.086 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:29.086 "is_configured": true, 00:39:29.086 "data_offset": 2048, 00:39:29.086 "data_size": 63488 00:39:29.086 }, 00:39:29.086 { 00:39:29.086 "name": "BaseBdev4", 00:39:29.086 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:29.086 "is_configured": true, 00:39:29.086 "data_offset": 2048, 00:39:29.086 "data_size": 63488 00:39:29.086 } 00:39:29.086 ] 00:39:29.086 }' 
00:39:29.086 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:29.086 11:50:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:29.653 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:29.653 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:39:29.653 [2024-06-10 11:50:13.475701] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:29.653 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:39:29.653 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:29.653 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:39:29.912 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:39:29.912 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:39:29.912 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:39:29.912 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:39:29.912 [2024-06-10 11:50:13.750398] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe4fc90 00:39:29.912 I/O size of 3145728 is greater than zero copy threshold (65536). 00:39:29.912 Zero copy mechanism will not be used. 00:39:29.912 Running I/O for 60 seconds... 
00:39:29.912 [2024-06-10 11:50:13.838652] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:39:29.912 [2024-06-10 11:50:13.844108] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe4fc90 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:30.172 11:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:30.172 11:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:30.172 "name": "raid_bdev1", 00:39:30.172 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:30.172 "strip_size_kb": 0, 00:39:30.172 "state": "online", 00:39:30.172 "raid_level": "raid1", 
00:39:30.172 "superblock": true, 00:39:30.172 "num_base_bdevs": 4, 00:39:30.172 "num_base_bdevs_discovered": 3, 00:39:30.172 "num_base_bdevs_operational": 3, 00:39:30.172 "base_bdevs_list": [ 00:39:30.172 { 00:39:30.172 "name": null, 00:39:30.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:30.172 "is_configured": false, 00:39:30.172 "data_offset": 2048, 00:39:30.172 "data_size": 63488 00:39:30.172 }, 00:39:30.172 { 00:39:30.172 "name": "BaseBdev2", 00:39:30.172 "uuid": "6cc38149-7309-58e7-8b14-75987a5371eb", 00:39:30.172 "is_configured": true, 00:39:30.172 "data_offset": 2048, 00:39:30.172 "data_size": 63488 00:39:30.172 }, 00:39:30.172 { 00:39:30.172 "name": "BaseBdev3", 00:39:30.172 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:30.172 "is_configured": true, 00:39:30.172 "data_offset": 2048, 00:39:30.172 "data_size": 63488 00:39:30.172 }, 00:39:30.172 { 00:39:30.173 "name": "BaseBdev4", 00:39:30.173 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:30.173 "is_configured": true, 00:39:30.173 "data_offset": 2048, 00:39:30.173 "data_size": 63488 00:39:30.173 } 00:39:30.173 ] 00:39:30.173 }' 00:39:30.173 11:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:30.173 11:50:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:30.740 11:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:30.999 [2024-06-10 11:50:14.744946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:30.999 [2024-06-10 11:50:14.785594] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd3aad0 00:39:30.999 11:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:39:30.999 [2024-06-10 11:50:14.787481] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:39:31.000 [2024-06-10 11:50:14.908945] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:39:31.000 [2024-06-10 11:50:14.909250] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:39:31.259 [2024-06-10 11:50:15.039235] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:31.259 [2024-06-10 11:50:15.039847] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:31.518 [2024-06-10 11:50:15.379869] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:31.518 [2024-06-10 11:50:15.380143] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:31.777 [2024-06-10 11:50:15.501787] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:39:32.036 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:32.036 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:32.036 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:32.036 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:32.036 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:32.036 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:32.037 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:32.037 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:32.037 "name": "raid_bdev1", 00:39:32.037 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:32.037 "strip_size_kb": 0, 00:39:32.037 "state": "online", 00:39:32.037 "raid_level": "raid1", 00:39:32.037 "superblock": true, 00:39:32.037 "num_base_bdevs": 4, 00:39:32.037 "num_base_bdevs_discovered": 4, 00:39:32.037 "num_base_bdevs_operational": 4, 00:39:32.037 "process": { 00:39:32.037 "type": "rebuild", 00:39:32.037 "target": "spare", 00:39:32.037 "progress": { 00:39:32.037 "blocks": 14336, 00:39:32.037 "percent": 22 00:39:32.037 } 00:39:32.037 }, 00:39:32.037 "base_bdevs_list": [ 00:39:32.037 { 00:39:32.037 "name": "spare", 00:39:32.037 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:32.037 "is_configured": true, 00:39:32.037 "data_offset": 2048, 00:39:32.037 "data_size": 63488 00:39:32.037 }, 00:39:32.037 { 00:39:32.037 "name": "BaseBdev2", 00:39:32.037 "uuid": "6cc38149-7309-58e7-8b14-75987a5371eb", 00:39:32.037 "is_configured": true, 00:39:32.037 "data_offset": 2048, 00:39:32.037 "data_size": 63488 00:39:32.037 }, 00:39:32.037 { 00:39:32.037 "name": "BaseBdev3", 00:39:32.037 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:32.037 "is_configured": true, 00:39:32.037 "data_offset": 2048, 00:39:32.037 "data_size": 63488 00:39:32.037 }, 00:39:32.037 { 00:39:32.037 "name": "BaseBdev4", 00:39:32.037 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:32.037 "is_configured": true, 00:39:32.037 "data_offset": 2048, 00:39:32.037 "data_size": 63488 00:39:32.037 } 00:39:32.037 ] 00:39:32.037 }' 00:39:32.037 11:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:32.037 [2024-06-10 11:50:15.982333] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:39:32.306 11:50:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:32.306 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:32.306 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:32.306 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:39:32.306 [2024-06-10 11:50:16.187579] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:32.566 [2024-06-10 11:50:16.313905] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:32.566 [2024-06-10 11:50:16.323409] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:32.566 [2024-06-10 11:50:16.323441] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:32.566 [2024-06-10 11:50:16.323449] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:32.566 [2024-06-10 11:50:16.347775] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe4fc90 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:32.566 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:32.826 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:32.826 "name": "raid_bdev1", 00:39:32.826 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:32.826 "strip_size_kb": 0, 00:39:32.826 "state": "online", 00:39:32.826 "raid_level": "raid1", 00:39:32.826 "superblock": true, 00:39:32.826 "num_base_bdevs": 4, 00:39:32.826 "num_base_bdevs_discovered": 3, 00:39:32.826 "num_base_bdevs_operational": 3, 00:39:32.826 "base_bdevs_list": [ 00:39:32.826 { 00:39:32.826 "name": null, 00:39:32.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:32.826 "is_configured": false, 00:39:32.826 "data_offset": 2048, 00:39:32.826 "data_size": 63488 00:39:32.826 }, 00:39:32.826 { 00:39:32.826 "name": "BaseBdev2", 00:39:32.826 "uuid": "6cc38149-7309-58e7-8b14-75987a5371eb", 00:39:32.826 "is_configured": true, 00:39:32.826 "data_offset": 2048, 00:39:32.826 "data_size": 63488 00:39:32.826 }, 00:39:32.826 { 00:39:32.826 "name": "BaseBdev3", 00:39:32.826 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:32.826 "is_configured": true, 00:39:32.826 "data_offset": 2048, 00:39:32.826 "data_size": 63488 00:39:32.826 }, 00:39:32.826 { 00:39:32.826 "name": "BaseBdev4", 00:39:32.826 "uuid": 
"43f9b2e0-5023-540e-a976-947942981aee", 00:39:32.826 "is_configured": true, 00:39:32.826 "data_offset": 2048, 00:39:32.826 "data_size": 63488 00:39:32.826 } 00:39:32.826 ] 00:39:32.826 }' 00:39:32.826 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:32.826 11:50:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:33.394 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:33.394 "name": "raid_bdev1", 00:39:33.395 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:33.395 "strip_size_kb": 0, 00:39:33.395 "state": "online", 00:39:33.395 "raid_level": "raid1", 00:39:33.395 "superblock": true, 00:39:33.395 "num_base_bdevs": 4, 00:39:33.395 "num_base_bdevs_discovered": 3, 00:39:33.395 "num_base_bdevs_operational": 3, 00:39:33.395 "base_bdevs_list": [ 00:39:33.395 { 00:39:33.395 "name": null, 00:39:33.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:33.395 "is_configured": false, 00:39:33.395 "data_offset": 2048, 00:39:33.395 "data_size": 63488 
00:39:33.395 }, 00:39:33.395 { 00:39:33.395 "name": "BaseBdev2", 00:39:33.395 "uuid": "6cc38149-7309-58e7-8b14-75987a5371eb", 00:39:33.395 "is_configured": true, 00:39:33.395 "data_offset": 2048, 00:39:33.395 "data_size": 63488 00:39:33.395 }, 00:39:33.395 { 00:39:33.395 "name": "BaseBdev3", 00:39:33.395 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:33.395 "is_configured": true, 00:39:33.395 "data_offset": 2048, 00:39:33.395 "data_size": 63488 00:39:33.395 }, 00:39:33.395 { 00:39:33.395 "name": "BaseBdev4", 00:39:33.395 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:33.395 "is_configured": true, 00:39:33.395 "data_offset": 2048, 00:39:33.395 "data_size": 63488 00:39:33.395 } 00:39:33.395 ] 00:39:33.395 }' 00:39:33.395 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:33.395 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:33.395 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:33.654 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:33.654 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:33.654 [2024-06-10 11:50:17.505579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:33.654 11:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:39:33.654 [2024-06-10 11:50:17.559976] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd87a0 00:39:33.654 [2024-06-10 11:50:17.561118] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:33.912 [2024-06-10 11:50:17.676811] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 
2048 offset_begin: 0 offset_end: 6144 00:39:33.912 [2024-06-10 11:50:17.677335] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:39:33.912 [2024-06-10 11:50:17.794386] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:33.912 [2024-06-10 11:50:17.794545] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:39:34.480 [2024-06-10 11:50:18.136620] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:39:34.480 [2024-06-10 11:50:18.358783] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:34.739 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:34.999 "name": "raid_bdev1", 00:39:34.999 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:34.999 "strip_size_kb": 0, 00:39:34.999 
"state": "online", 00:39:34.999 "raid_level": "raid1", 00:39:34.999 "superblock": true, 00:39:34.999 "num_base_bdevs": 4, 00:39:34.999 "num_base_bdevs_discovered": 4, 00:39:34.999 "num_base_bdevs_operational": 4, 00:39:34.999 "process": { 00:39:34.999 "type": "rebuild", 00:39:34.999 "target": "spare", 00:39:34.999 "progress": { 00:39:34.999 "blocks": 14336, 00:39:34.999 "percent": 22 00:39:34.999 } 00:39:34.999 }, 00:39:34.999 "base_bdevs_list": [ 00:39:34.999 { 00:39:34.999 "name": "spare", 00:39:34.999 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:34.999 "is_configured": true, 00:39:34.999 "data_offset": 2048, 00:39:34.999 "data_size": 63488 00:39:34.999 }, 00:39:34.999 { 00:39:34.999 "name": "BaseBdev2", 00:39:34.999 "uuid": "6cc38149-7309-58e7-8b14-75987a5371eb", 00:39:34.999 "is_configured": true, 00:39:34.999 "data_offset": 2048, 00:39:34.999 "data_size": 63488 00:39:34.999 }, 00:39:34.999 { 00:39:34.999 "name": "BaseBdev3", 00:39:34.999 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:34.999 "is_configured": true, 00:39:34.999 "data_offset": 2048, 00:39:34.999 "data_size": 63488 00:39:34.999 }, 00:39:34.999 { 00:39:34.999 "name": "BaseBdev4", 00:39:34.999 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:34.999 "is_configured": true, 00:39:34.999 "data_offset": 2048, 00:39:34.999 "data_size": 63488 00:39:34.999 } 00:39:34.999 ] 00:39:34.999 }' 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:34.999 [2024-06-10 11:50:18.812693] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:39:34.999 [2024-06-10 11:50:18.813337] bdev_raid.c: 839:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:39:34.999 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:39:34.999 11:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:39:35.258 [2024-06-10 11:50:18.988723] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:39:35.517 [2024-06-10 11:50:19.248439] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xe4fc90 00:39:35.517 [2024-06-10 11:50:19.248468] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xdd87a0 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:35.517 [2024-06-10 11:50:19.358816] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:39:35.517 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:35.517 "name": "raid_bdev1", 00:39:35.517 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:35.517 "strip_size_kb": 0, 00:39:35.517 "state": "online", 00:39:35.517 "raid_level": "raid1", 00:39:35.517 "superblock": true, 00:39:35.517 "num_base_bdevs": 4, 00:39:35.517 "num_base_bdevs_discovered": 3, 00:39:35.517 "num_base_bdevs_operational": 3, 00:39:35.517 "process": { 00:39:35.517 "type": "rebuild", 00:39:35.517 "target": "spare", 00:39:35.517 "progress": { 00:39:35.517 "blocks": 20480, 00:39:35.517 "percent": 32 00:39:35.517 } 00:39:35.517 }, 00:39:35.517 "base_bdevs_list": [ 00:39:35.517 { 00:39:35.517 "name": "spare", 00:39:35.517 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:35.517 "is_configured": true, 00:39:35.518 "data_offset": 2048, 00:39:35.518 "data_size": 63488 00:39:35.518 }, 00:39:35.518 { 00:39:35.518 "name": null, 00:39:35.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:35.518 "is_configured": false, 00:39:35.518 "data_offset": 2048, 00:39:35.518 "data_size": 63488 00:39:35.518 }, 00:39:35.518 { 00:39:35.518 "name": "BaseBdev3", 00:39:35.518 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:35.518 
"is_configured": true, 00:39:35.518 "data_offset": 2048, 00:39:35.518 "data_size": 63488 00:39:35.518 }, 00:39:35.518 { 00:39:35.518 "name": "BaseBdev4", 00:39:35.518 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:35.518 "is_configured": true, 00:39:35.518 "data_offset": 2048, 00:39:35.518 "data_size": 63488 00:39:35.518 } 00:39:35.518 ] 00:39:35.518 }' 00:39:35.518 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=749 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:35.777 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:36.036 11:50:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:36.036 "name": "raid_bdev1", 00:39:36.036 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:36.036 "strip_size_kb": 0, 00:39:36.036 "state": "online", 00:39:36.036 "raid_level": "raid1", 00:39:36.036 "superblock": true, 00:39:36.036 "num_base_bdevs": 4, 00:39:36.036 "num_base_bdevs_discovered": 3, 00:39:36.036 "num_base_bdevs_operational": 3, 00:39:36.036 "process": { 00:39:36.036 "type": "rebuild", 00:39:36.036 "target": "spare", 00:39:36.036 "progress": { 00:39:36.036 "blocks": 24576, 00:39:36.036 "percent": 38 00:39:36.036 } 00:39:36.036 }, 00:39:36.036 "base_bdevs_list": [ 00:39:36.036 { 00:39:36.036 "name": "spare", 00:39:36.036 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:36.036 "is_configured": true, 00:39:36.036 "data_offset": 2048, 00:39:36.036 "data_size": 63488 00:39:36.036 }, 00:39:36.036 { 00:39:36.036 "name": null, 00:39:36.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:36.036 "is_configured": false, 00:39:36.036 "data_offset": 2048, 00:39:36.036 "data_size": 63488 00:39:36.036 }, 00:39:36.036 { 00:39:36.036 "name": "BaseBdev3", 00:39:36.036 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:36.036 "is_configured": true, 00:39:36.036 "data_offset": 2048, 00:39:36.036 "data_size": 63488 00:39:36.036 }, 00:39:36.036 { 00:39:36.036 "name": "BaseBdev4", 00:39:36.036 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:36.036 "is_configured": true, 00:39:36.036 "data_offset": 2048, 00:39:36.036 "data_size": 63488 00:39:36.036 } 00:39:36.036 ] 00:39:36.036 }' 00:39:36.036 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:36.036 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:36.036 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:36.036 11:50:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:36.036 11:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:39:36.602 [2024-06-10 11:50:20.489135] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:36.861 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:37.119 [2024-06-10 11:50:20.932859] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:39:37.119 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:37.119 "name": "raid_bdev1", 00:39:37.119 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:37.119 "strip_size_kb": 0, 00:39:37.119 "state": "online", 00:39:37.119 "raid_level": "raid1", 00:39:37.119 "superblock": true, 00:39:37.119 "num_base_bdevs": 4, 00:39:37.119 "num_base_bdevs_discovered": 3, 00:39:37.119 "num_base_bdevs_operational": 3, 
00:39:37.119 "process": { 00:39:37.119 "type": "rebuild", 00:39:37.119 "target": "spare", 00:39:37.119 "progress": { 00:39:37.119 "blocks": 47104, 00:39:37.119 "percent": 74 00:39:37.119 } 00:39:37.119 }, 00:39:37.119 "base_bdevs_list": [ 00:39:37.119 { 00:39:37.119 "name": "spare", 00:39:37.119 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:37.119 "is_configured": true, 00:39:37.119 "data_offset": 2048, 00:39:37.119 "data_size": 63488 00:39:37.119 }, 00:39:37.119 { 00:39:37.119 "name": null, 00:39:37.119 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:37.119 "is_configured": false, 00:39:37.119 "data_offset": 2048, 00:39:37.119 "data_size": 63488 00:39:37.119 }, 00:39:37.119 { 00:39:37.119 "name": "BaseBdev3", 00:39:37.119 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:37.119 "is_configured": true, 00:39:37.119 "data_offset": 2048, 00:39:37.119 "data_size": 63488 00:39:37.119 }, 00:39:37.119 { 00:39:37.119 "name": "BaseBdev4", 00:39:37.119 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:37.119 "is_configured": true, 00:39:37.119 "data_offset": 2048, 00:39:37.119 "data_size": 63488 00:39:37.119 } 00:39:37.119 ] 00:39:37.119 }' 00:39:37.119 11:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:37.119 11:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:37.119 11:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:37.377 11:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:37.377 11:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:39:37.377 [2024-06-10 11:50:21.261360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:39:37.636 [2024-06-10 11:50:21.476224] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: 
split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:39:38.203 [2024-06-10 11:50:22.022949] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:38.203 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:38.203 [2024-06-10 11:50:22.123252] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:39:38.203 [2024-06-10 11:50:22.125643] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:38.474 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:38.475 "name": "raid_bdev1", 00:39:38.475 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:38.475 "strip_size_kb": 0, 00:39:38.475 "state": "online", 00:39:38.475 "raid_level": "raid1", 00:39:38.475 "superblock": true, 00:39:38.475 "num_base_bdevs": 4, 00:39:38.475 "num_base_bdevs_discovered": 3, 00:39:38.475 "num_base_bdevs_operational": 3, 00:39:38.475 "base_bdevs_list": [ 00:39:38.475 { 00:39:38.475 "name": 
"spare", 00:39:38.475 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:38.475 "is_configured": true, 00:39:38.475 "data_offset": 2048, 00:39:38.475 "data_size": 63488 00:39:38.475 }, 00:39:38.475 { 00:39:38.475 "name": null, 00:39:38.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:38.475 "is_configured": false, 00:39:38.475 "data_offset": 2048, 00:39:38.475 "data_size": 63488 00:39:38.475 }, 00:39:38.475 { 00:39:38.475 "name": "BaseBdev3", 00:39:38.475 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:38.475 "is_configured": true, 00:39:38.475 "data_offset": 2048, 00:39:38.475 "data_size": 63488 00:39:38.475 }, 00:39:38.475 { 00:39:38.475 "name": "BaseBdev4", 00:39:38.475 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:38.475 "is_configured": true, 00:39:38.475 "data_offset": 2048, 00:39:38.475 "data_size": 63488 00:39:38.475 } 00:39:38.475 ] 00:39:38.475 }' 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- 
# local raid_bdev_info 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:38.475 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:38.733 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:38.733 "name": "raid_bdev1", 00:39:38.733 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:38.733 "strip_size_kb": 0, 00:39:38.733 "state": "online", 00:39:38.733 "raid_level": "raid1", 00:39:38.733 "superblock": true, 00:39:38.733 "num_base_bdevs": 4, 00:39:38.733 "num_base_bdevs_discovered": 3, 00:39:38.733 "num_base_bdevs_operational": 3, 00:39:38.733 "base_bdevs_list": [ 00:39:38.733 { 00:39:38.733 "name": "spare", 00:39:38.733 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:38.733 "is_configured": true, 00:39:38.733 "data_offset": 2048, 00:39:38.733 "data_size": 63488 00:39:38.733 }, 00:39:38.733 { 00:39:38.733 "name": null, 00:39:38.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:38.733 "is_configured": false, 00:39:38.733 "data_offset": 2048, 00:39:38.733 "data_size": 63488 00:39:38.733 }, 00:39:38.733 { 00:39:38.733 "name": "BaseBdev3", 00:39:38.733 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:38.734 "is_configured": true, 00:39:38.734 "data_offset": 2048, 00:39:38.734 "data_size": 63488 00:39:38.734 }, 00:39:38.734 { 00:39:38.734 "name": "BaseBdev4", 00:39:38.734 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:38.734 "is_configured": true, 00:39:38.734 "data_offset": 2048, 00:39:38.734 "data_size": 63488 00:39:38.734 } 00:39:38.734 ] 00:39:38.734 }' 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == 
\n\o\n\e ]] 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:38.734 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:38.992 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:38.992 "name": "raid_bdev1", 00:39:38.992 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:38.992 "strip_size_kb": 0, 00:39:38.992 "state": "online", 00:39:38.992 "raid_level": 
"raid1", 00:39:38.992 "superblock": true, 00:39:38.992 "num_base_bdevs": 4, 00:39:38.992 "num_base_bdevs_discovered": 3, 00:39:38.992 "num_base_bdevs_operational": 3, 00:39:38.992 "base_bdevs_list": [ 00:39:38.992 { 00:39:38.992 "name": "spare", 00:39:38.992 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:38.992 "is_configured": true, 00:39:38.992 "data_offset": 2048, 00:39:38.992 "data_size": 63488 00:39:38.992 }, 00:39:38.992 { 00:39:38.992 "name": null, 00:39:38.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:38.992 "is_configured": false, 00:39:38.992 "data_offset": 2048, 00:39:38.992 "data_size": 63488 00:39:38.992 }, 00:39:38.992 { 00:39:38.992 "name": "BaseBdev3", 00:39:38.992 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:38.992 "is_configured": true, 00:39:38.992 "data_offset": 2048, 00:39:38.992 "data_size": 63488 00:39:38.992 }, 00:39:38.992 { 00:39:38.992 "name": "BaseBdev4", 00:39:38.992 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:38.992 "is_configured": true, 00:39:38.992 "data_offset": 2048, 00:39:38.992 "data_size": 63488 00:39:38.992 } 00:39:38.992 ] 00:39:38.992 }' 00:39:38.993 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:38.993 11:50:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:39.559 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:39:39.559 [2024-06-10 11:50:23.379262] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:39.559 [2024-06-10 11:50:23.379287] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:39.559 00:39:39.559 Latency(us) 00:39:39.559 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:39.559 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 
2, IO size: 3145728) 00:39:39.559 raid_bdev1 : 9.65 99.71 299.14 0.00 0.00 13350.92 245.76 113519.75 00:39:39.559 =================================================================================================================== 00:39:39.559 Total : 99.71 299.14 0.00 0.00 13350.92 245.76 113519.75 00:39:39.559 [2024-06-10 11:50:23.426137] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:39.559 [2024-06-10 11:50:23.426157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:39.559 [2024-06-10 11:50:23.426218] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:39.559 [2024-06-10 11:50:23.426226] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe54790 name raid_bdev1, state offline 00:39:39.559 0 00:39:39.559 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:39.559 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:39.817 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:39:40.076 /dev/nbd0 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:40.076 1+0 records in 00:39:40.076 
1+0 records out 00:39:40.076 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186962 s, 21.9 MB/s 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:40.076 11:50:23 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:40.076 11:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:39:40.076 /dev/nbd1 00:39:40.076 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:39:40.334 1+0 records in 00:39:40.334 1+0 records out 00:39:40.334 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226241 s, 18.1 MB/s 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:40.334 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:39:40.593 
11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:39:40.593 /dev/nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:40.593 1+0 records in 00:39:40.593 1+0 records out 00:39:40.593 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261586 s, 15.7 MB/s 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:39:40.593 11:50:24 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:40.593 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:40.851 11:50:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:40.851 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 
00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:39:41.109 11:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:41.367 [2024-06-10 11:50:25.288907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:41.367 [2024-06-10 11:50:25.288952] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:41.367 [2024-06-10 11:50:25.288966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe54a10 00:39:41.367 [2024-06-10 11:50:25.288974] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:41.367 [2024-06-10 11:50:25.290171] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:41.367 [2024-06-10 11:50:25.290195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:41.367 [2024-06-10 11:50:25.290254] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:39:41.367 [2024-06-10 11:50:25.290275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:41.367 [2024-06-10 11:50:25.290349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:39:41.367 [2024-06-10 11:50:25.290397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:39:41.367 spare 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 
0 3 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:41.367 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:41.626 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:41.626 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:41.626 [2024-06-10 11:50:25.390695] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdd9f70 00:39:41.626 [2024-06-10 11:50:25.390707] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:39:41.626 [2024-06-10 11:50:25.390841] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe53710 00:39:41.626 [2024-06-10 11:50:25.390952] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdd9f70 00:39:41.626 [2024-06-10 11:50:25.390959] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdd9f70 00:39:41.626 [2024-06-10 
11:50:25.391035] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:41.626 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:41.626 "name": "raid_bdev1", 00:39:41.626 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:41.626 "strip_size_kb": 0, 00:39:41.626 "state": "online", 00:39:41.626 "raid_level": "raid1", 00:39:41.626 "superblock": true, 00:39:41.626 "num_base_bdevs": 4, 00:39:41.626 "num_base_bdevs_discovered": 3, 00:39:41.626 "num_base_bdevs_operational": 3, 00:39:41.626 "base_bdevs_list": [ 00:39:41.626 { 00:39:41.626 "name": "spare", 00:39:41.626 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:41.626 "is_configured": true, 00:39:41.626 "data_offset": 2048, 00:39:41.626 "data_size": 63488 00:39:41.626 }, 00:39:41.626 { 00:39:41.626 "name": null, 00:39:41.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:41.626 "is_configured": false, 00:39:41.626 "data_offset": 2048, 00:39:41.626 "data_size": 63488 00:39:41.626 }, 00:39:41.626 { 00:39:41.626 "name": "BaseBdev3", 00:39:41.626 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:41.626 "is_configured": true, 00:39:41.626 "data_offset": 2048, 00:39:41.626 "data_size": 63488 00:39:41.626 }, 00:39:41.626 { 00:39:41.626 "name": "BaseBdev4", 00:39:41.626 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:41.626 "is_configured": true, 00:39:41.626 "data_offset": 2048, 00:39:41.626 "data_size": 63488 00:39:41.626 } 00:39:41.626 ] 00:39:41.626 }' 00:39:41.626 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:41.626 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:42.192 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:42.192 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:42.192 11:50:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:42.192 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:42.192 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:42.192 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:42.192 11:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:42.450 "name": "raid_bdev1", 00:39:42.450 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:42.450 "strip_size_kb": 0, 00:39:42.450 "state": "online", 00:39:42.450 "raid_level": "raid1", 00:39:42.450 "superblock": true, 00:39:42.450 "num_base_bdevs": 4, 00:39:42.450 "num_base_bdevs_discovered": 3, 00:39:42.450 "num_base_bdevs_operational": 3, 00:39:42.450 "base_bdevs_list": [ 00:39:42.450 { 00:39:42.450 "name": "spare", 00:39:42.450 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:42.450 "is_configured": true, 00:39:42.450 "data_offset": 2048, 00:39:42.450 "data_size": 63488 00:39:42.450 }, 00:39:42.450 { 00:39:42.450 "name": null, 00:39:42.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:42.450 "is_configured": false, 00:39:42.450 "data_offset": 2048, 00:39:42.450 "data_size": 63488 00:39:42.450 }, 00:39:42.450 { 00:39:42.450 "name": "BaseBdev3", 00:39:42.450 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:42.450 "is_configured": true, 00:39:42.450 "data_offset": 2048, 00:39:42.450 "data_size": 63488 00:39:42.450 }, 00:39:42.450 { 00:39:42.450 "name": "BaseBdev4", 00:39:42.450 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:42.450 "is_configured": true, 00:39:42.450 "data_offset": 2048, 00:39:42.450 
"data_size": 63488 00:39:42.450 } 00:39:42.450 ] 00:39:42.450 }' 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:42.450 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:39:42.715 [2024-06-10 11:50:26.572395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:42.715 11:50:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:42.715 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:42.974 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:42.974 "name": "raid_bdev1", 00:39:42.974 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:42.974 "strip_size_kb": 0, 00:39:42.974 "state": "online", 00:39:42.974 "raid_level": "raid1", 00:39:42.974 "superblock": true, 00:39:42.974 "num_base_bdevs": 4, 00:39:42.974 "num_base_bdevs_discovered": 2, 00:39:42.974 "num_base_bdevs_operational": 2, 00:39:42.974 "base_bdevs_list": [ 00:39:42.974 { 00:39:42.974 "name": null, 00:39:42.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:42.974 "is_configured": false, 00:39:42.974 "data_offset": 2048, 00:39:42.974 "data_size": 63488 00:39:42.974 }, 00:39:42.974 { 00:39:42.974 "name": null, 00:39:42.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:42.974 "is_configured": false, 00:39:42.974 "data_offset": 2048, 00:39:42.974 "data_size": 63488 00:39:42.974 }, 00:39:42.974 { 00:39:42.974 "name": "BaseBdev3", 00:39:42.974 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:42.974 "is_configured": true, 00:39:42.974 "data_offset": 2048, 00:39:42.974 "data_size": 63488 00:39:42.974 }, 00:39:42.974 { 00:39:42.974 "name": "BaseBdev4", 00:39:42.974 "uuid": 
"43f9b2e0-5023-540e-a976-947942981aee", 00:39:42.974 "is_configured": true, 00:39:42.974 "data_offset": 2048, 00:39:42.974 "data_size": 63488 00:39:42.974 } 00:39:42.974 ] 00:39:42.974 }' 00:39:42.974 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:42.974 11:50:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:43.541 11:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:43.541 [2024-06-10 11:50:27.374548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:43.541 [2024-06-10 11:50:27.374666] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:39:43.541 [2024-06-10 11:50:27.374678] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:39:43.541 [2024-06-10 11:50:27.374701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:43.541 [2024-06-10 11:50:27.378743] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe53710 00:39:43.541 [2024-06-10 11:50:27.380457] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:43.541 11:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:39:44.475 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:44.475 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:44.476 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:44.476 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:44.476 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:44.476 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:44.476 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:44.734 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:44.734 "name": "raid_bdev1", 00:39:44.734 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:44.734 "strip_size_kb": 0, 00:39:44.734 "state": "online", 00:39:44.734 "raid_level": "raid1", 00:39:44.734 "superblock": true, 00:39:44.734 "num_base_bdevs": 4, 00:39:44.734 "num_base_bdevs_discovered": 3, 00:39:44.734 "num_base_bdevs_operational": 3, 00:39:44.734 "process": { 00:39:44.734 "type": "rebuild", 00:39:44.734 "target": "spare", 00:39:44.734 "progress": { 00:39:44.734 "blocks": 22528, 
00:39:44.734 "percent": 35 00:39:44.734 } 00:39:44.734 }, 00:39:44.734 "base_bdevs_list": [ 00:39:44.734 { 00:39:44.734 "name": "spare", 00:39:44.734 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:44.734 "is_configured": true, 00:39:44.734 "data_offset": 2048, 00:39:44.734 "data_size": 63488 00:39:44.734 }, 00:39:44.734 { 00:39:44.734 "name": null, 00:39:44.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:44.734 "is_configured": false, 00:39:44.734 "data_offset": 2048, 00:39:44.734 "data_size": 63488 00:39:44.734 }, 00:39:44.734 { 00:39:44.734 "name": "BaseBdev3", 00:39:44.734 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:44.734 "is_configured": true, 00:39:44.734 "data_offset": 2048, 00:39:44.734 "data_size": 63488 00:39:44.734 }, 00:39:44.734 { 00:39:44.734 "name": "BaseBdev4", 00:39:44.734 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:44.734 "is_configured": true, 00:39:44.734 "data_offset": 2048, 00:39:44.734 "data_size": 63488 00:39:44.734 } 00:39:44.734 ] 00:39:44.734 }' 00:39:44.734 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:44.734 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:44.734 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:44.734 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:44.734 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:44.994 [2024-06-10 11:50:28.819332] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:44.994 [2024-06-10 11:50:28.891490] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:44.994 [2024-06-10 11:50:28.891528] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:44.994 [2024-06-10 11:50:28.891538] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:44.994 [2024-06-10 11:50:28.891543] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:44.994 11:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:45.253 11:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:45.253 "name": "raid_bdev1", 00:39:45.253 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 
00:39:45.253 "strip_size_kb": 0, 00:39:45.253 "state": "online", 00:39:45.253 "raid_level": "raid1", 00:39:45.253 "superblock": true, 00:39:45.253 "num_base_bdevs": 4, 00:39:45.253 "num_base_bdevs_discovered": 2, 00:39:45.253 "num_base_bdevs_operational": 2, 00:39:45.253 "base_bdevs_list": [ 00:39:45.253 { 00:39:45.253 "name": null, 00:39:45.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:45.253 "is_configured": false, 00:39:45.253 "data_offset": 2048, 00:39:45.253 "data_size": 63488 00:39:45.253 }, 00:39:45.253 { 00:39:45.253 "name": null, 00:39:45.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:45.253 "is_configured": false, 00:39:45.253 "data_offset": 2048, 00:39:45.253 "data_size": 63488 00:39:45.253 }, 00:39:45.253 { 00:39:45.253 "name": "BaseBdev3", 00:39:45.253 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:45.253 "is_configured": true, 00:39:45.253 "data_offset": 2048, 00:39:45.253 "data_size": 63488 00:39:45.253 }, 00:39:45.253 { 00:39:45.253 "name": "BaseBdev4", 00:39:45.253 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:45.253 "is_configured": true, 00:39:45.253 "data_offset": 2048, 00:39:45.253 "data_size": 63488 00:39:45.253 } 00:39:45.253 ] 00:39:45.253 }' 00:39:45.253 11:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:45.253 11:50:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:45.821 11:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:45.821 [2024-06-10 11:50:29.729861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:45.821 [2024-06-10 11:50:29.729905] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:45.821 [2024-06-10 11:50:29.729922] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0xdda950 00:39:45.821 [2024-06-10 11:50:29.729931] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:45.821 [2024-06-10 11:50:29.730190] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:45.821 [2024-06-10 11:50:29.730202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:45.821 [2024-06-10 11:50:29.730260] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:39:45.821 [2024-06-10 11:50:29.730270] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:39:45.821 [2024-06-10 11:50:29.730281] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:39:45.821 [2024-06-10 11:50:29.730295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:45.821 [2024-06-10 11:50:29.734250] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xca2610 00:39:45.821 [2024-06-10 11:50:29.735254] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:45.821 spare 00:39:45.821 11:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:47.197 "name": "raid_bdev1", 00:39:47.197 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:47.197 "strip_size_kb": 0, 00:39:47.197 "state": "online", 00:39:47.197 "raid_level": "raid1", 00:39:47.197 "superblock": true, 00:39:47.197 "num_base_bdevs": 4, 00:39:47.197 "num_base_bdevs_discovered": 3, 00:39:47.197 "num_base_bdevs_operational": 3, 00:39:47.197 "process": { 00:39:47.197 "type": "rebuild", 00:39:47.197 "target": "spare", 00:39:47.197 "progress": { 00:39:47.197 "blocks": 22528, 00:39:47.197 "percent": 35 00:39:47.197 } 00:39:47.197 }, 00:39:47.197 "base_bdevs_list": [ 00:39:47.197 { 00:39:47.197 "name": "spare", 00:39:47.197 "uuid": "adfff921-f1a0-5a50-bb7a-d470aeffb389", 00:39:47.197 "is_configured": true, 00:39:47.197 "data_offset": 2048, 00:39:47.197 "data_size": 63488 00:39:47.197 }, 00:39:47.197 { 00:39:47.197 "name": null, 00:39:47.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:47.197 "is_configured": false, 00:39:47.197 "data_offset": 2048, 00:39:47.197 "data_size": 63488 00:39:47.197 }, 00:39:47.197 { 00:39:47.197 "name": "BaseBdev3", 00:39:47.197 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:47.197 "is_configured": true, 00:39:47.197 "data_offset": 2048, 00:39:47.197 "data_size": 63488 00:39:47.197 }, 00:39:47.197 { 00:39:47.197 "name": "BaseBdev4", 00:39:47.197 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:47.197 "is_configured": true, 00:39:47.197 "data_offset": 2048, 00:39:47.197 "data_size": 63488 00:39:47.197 } 00:39:47.197 ] 00:39:47.197 }' 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:47.197 11:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:47.197 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:47.197 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:47.457 [2024-06-10 11:50:31.163351] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:47.457 [2024-06-10 11:50:31.246067] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:47.457 [2024-06-10 11:50:31.246100] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:47.457 [2024-06-10 11:50:31.246111] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:47.457 [2024-06-10 11:50:31.246116] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:47.457 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:47.716 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:47.716 "name": "raid_bdev1", 00:39:47.716 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:47.716 "strip_size_kb": 0, 00:39:47.716 "state": "online", 00:39:47.716 "raid_level": "raid1", 00:39:47.716 "superblock": true, 00:39:47.716 "num_base_bdevs": 4, 00:39:47.716 "num_base_bdevs_discovered": 2, 00:39:47.716 "num_base_bdevs_operational": 2, 00:39:47.716 "base_bdevs_list": [ 00:39:47.716 { 00:39:47.716 "name": null, 00:39:47.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:47.716 "is_configured": false, 00:39:47.716 "data_offset": 2048, 00:39:47.716 "data_size": 63488 00:39:47.716 }, 00:39:47.716 { 00:39:47.716 "name": null, 00:39:47.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:47.716 "is_configured": false, 00:39:47.716 "data_offset": 2048, 00:39:47.716 "data_size": 63488 00:39:47.716 }, 00:39:47.716 { 00:39:47.716 "name": "BaseBdev3", 00:39:47.716 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:47.716 "is_configured": true, 00:39:47.716 "data_offset": 2048, 00:39:47.716 "data_size": 63488 00:39:47.716 }, 00:39:47.716 { 00:39:47.716 "name": "BaseBdev4", 00:39:47.716 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:47.716 "is_configured": true, 00:39:47.716 "data_offset": 2048, 
00:39:47.716 "data_size": 63488 00:39:47.716 } 00:39:47.716 ] 00:39:47.716 }' 00:39:47.716 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:47.716 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:48.286 11:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:48.286 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:48.286 "name": "raid_bdev1", 00:39:48.286 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:48.286 "strip_size_kb": 0, 00:39:48.286 "state": "online", 00:39:48.286 "raid_level": "raid1", 00:39:48.286 "superblock": true, 00:39:48.286 "num_base_bdevs": 4, 00:39:48.286 "num_base_bdevs_discovered": 2, 00:39:48.286 "num_base_bdevs_operational": 2, 00:39:48.286 "base_bdevs_list": [ 00:39:48.286 { 00:39:48.286 "name": null, 00:39:48.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:48.286 "is_configured": false, 00:39:48.286 "data_offset": 2048, 00:39:48.286 "data_size": 63488 00:39:48.286 }, 00:39:48.286 { 00:39:48.286 "name": null, 00:39:48.286 "uuid": "00000000-0000-0000-0000-000000000000", 
00:39:48.286 "is_configured": false, 00:39:48.286 "data_offset": 2048, 00:39:48.286 "data_size": 63488 00:39:48.286 }, 00:39:48.286 { 00:39:48.286 "name": "BaseBdev3", 00:39:48.286 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:48.286 "is_configured": true, 00:39:48.286 "data_offset": 2048, 00:39:48.286 "data_size": 63488 00:39:48.286 }, 00:39:48.286 { 00:39:48.286 "name": "BaseBdev4", 00:39:48.286 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:48.286 "is_configured": true, 00:39:48.286 "data_offset": 2048, 00:39:48.286 "data_size": 63488 00:39:48.286 } 00:39:48.286 ] 00:39:48.286 }' 00:39:48.286 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:48.286 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:48.286 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:48.286 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:48.286 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:39:48.648 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:39:48.648 [2024-06-10 11:50:32.478098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:39:48.648 [2024-06-10 11:50:32.478136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:48.648 [2024-06-10 11:50:32.478149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xca2410 00:39:48.648 [2024-06-10 11:50:32.478157] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:48.648 
[2024-06-10 11:50:32.478407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:48.648 [2024-06-10 11:50:32.478421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:39:48.648 [2024-06-10 11:50:32.478468] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:39:48.648 [2024-06-10 11:50:32.478479] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:39:48.648 [2024-06-10 11:50:32.478487] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:39:48.648 BaseBdev1 00:39:48.648 11:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:49.584 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:49.843 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:49.843 "name": "raid_bdev1", 00:39:49.843 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:49.843 "strip_size_kb": 0, 00:39:49.843 "state": "online", 00:39:49.843 "raid_level": "raid1", 00:39:49.843 "superblock": true, 00:39:49.843 "num_base_bdevs": 4, 00:39:49.843 "num_base_bdevs_discovered": 2, 00:39:49.843 "num_base_bdevs_operational": 2, 00:39:49.843 "base_bdevs_list": [ 00:39:49.843 { 00:39:49.843 "name": null, 00:39:49.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:49.843 "is_configured": false, 00:39:49.843 "data_offset": 2048, 00:39:49.843 "data_size": 63488 00:39:49.843 }, 00:39:49.843 { 00:39:49.843 "name": null, 00:39:49.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:49.843 "is_configured": false, 00:39:49.843 "data_offset": 2048, 00:39:49.843 "data_size": 63488 00:39:49.843 }, 00:39:49.843 { 00:39:49.843 "name": "BaseBdev3", 00:39:49.843 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:49.843 "is_configured": true, 00:39:49.843 "data_offset": 2048, 00:39:49.843 "data_size": 63488 00:39:49.843 }, 00:39:49.843 { 00:39:49.843 "name": "BaseBdev4", 00:39:49.843 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:49.843 "is_configured": true, 00:39:49.843 "data_offset": 2048, 00:39:49.843 "data_size": 63488 00:39:49.843 } 00:39:49.843 ] 00:39:49.843 }' 00:39:49.843 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:49.843 11:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:50.410 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:39:50.410 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:50.410 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:50.410 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:50.410 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:50.411 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:50.411 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:50.411 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:50.411 "name": "raid_bdev1", 00:39:50.411 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:50.411 "strip_size_kb": 0, 00:39:50.411 "state": "online", 00:39:50.411 "raid_level": "raid1", 00:39:50.411 "superblock": true, 00:39:50.411 "num_base_bdevs": 4, 00:39:50.411 "num_base_bdevs_discovered": 2, 00:39:50.411 "num_base_bdevs_operational": 2, 00:39:50.411 "base_bdevs_list": [ 00:39:50.411 { 00:39:50.411 "name": null, 00:39:50.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:50.411 "is_configured": false, 00:39:50.411 "data_offset": 2048, 00:39:50.411 "data_size": 63488 00:39:50.411 }, 00:39:50.411 { 00:39:50.411 "name": null, 00:39:50.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:50.411 "is_configured": false, 00:39:50.411 "data_offset": 2048, 00:39:50.411 "data_size": 63488 00:39:50.411 }, 00:39:50.411 { 00:39:50.411 "name": "BaseBdev3", 00:39:50.411 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:50.411 "is_configured": true, 00:39:50.411 "data_offset": 2048, 00:39:50.411 "data_size": 63488 00:39:50.411 }, 00:39:50.411 { 
00:39:50.411 "name": "BaseBdev4", 00:39:50.411 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:50.411 "is_configured": true, 00:39:50.411 "data_offset": 2048, 00:39:50.411 "data_size": 63488 00:39:50.411 } 00:39:50.411 ] 00:39:50.411 }' 00:39:50.411 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:50.670 [2024-06-10 11:50:34.571727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:50.670 [2024-06-10 11:50:34.571828] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:39:50.670 [2024-06-10 11:50:34.571839] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:39:50.670 request: 00:39:50.670 { 00:39:50.670 "raid_bdev": "raid_bdev1", 00:39:50.670 "base_bdev": "BaseBdev1", 00:39:50.670 "method": "bdev_raid_add_base_bdev", 00:39:50.670 "req_id": 1 00:39:50.670 } 00:39:50.670 Got JSON-RPC error response 00:39:50.670 response: 00:39:50.670 { 00:39:50.670 "code": -22, 00:39:50.670 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:39:50.670 } 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@676 -- # (( !es == 0 )) 00:39:50.670 11:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:52.047 "name": "raid_bdev1", 00:39:52.047 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:52.047 "strip_size_kb": 0, 00:39:52.047 "state": "online", 00:39:52.047 "raid_level": "raid1", 00:39:52.047 "superblock": true, 00:39:52.047 "num_base_bdevs": 4, 00:39:52.047 
"num_base_bdevs_discovered": 2, 00:39:52.047 "num_base_bdevs_operational": 2, 00:39:52.047 "base_bdevs_list": [ 00:39:52.047 { 00:39:52.047 "name": null, 00:39:52.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:52.047 "is_configured": false, 00:39:52.047 "data_offset": 2048, 00:39:52.047 "data_size": 63488 00:39:52.047 }, 00:39:52.047 { 00:39:52.047 "name": null, 00:39:52.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:52.047 "is_configured": false, 00:39:52.047 "data_offset": 2048, 00:39:52.047 "data_size": 63488 00:39:52.047 }, 00:39:52.047 { 00:39:52.047 "name": "BaseBdev3", 00:39:52.047 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:52.047 "is_configured": true, 00:39:52.047 "data_offset": 2048, 00:39:52.047 "data_size": 63488 00:39:52.047 }, 00:39:52.047 { 00:39:52.047 "name": "BaseBdev4", 00:39:52.047 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:52.047 "is_configured": true, 00:39:52.047 "data_offset": 2048, 00:39:52.047 "data_size": 63488 00:39:52.047 } 00:39:52.047 ] 00:39:52.047 }' 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:52.047 11:50:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:52.615 "name": "raid_bdev1", 00:39:52.615 "uuid": "2e593a30-6530-42b2-b6c5-23ce758078f9", 00:39:52.615 "strip_size_kb": 0, 00:39:52.615 "state": "online", 00:39:52.615 "raid_level": "raid1", 00:39:52.615 "superblock": true, 00:39:52.615 "num_base_bdevs": 4, 00:39:52.615 "num_base_bdevs_discovered": 2, 00:39:52.615 "num_base_bdevs_operational": 2, 00:39:52.615 "base_bdevs_list": [ 00:39:52.615 { 00:39:52.615 "name": null, 00:39:52.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:52.615 "is_configured": false, 00:39:52.615 "data_offset": 2048, 00:39:52.615 "data_size": 63488 00:39:52.615 }, 00:39:52.615 { 00:39:52.615 "name": null, 00:39:52.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:52.615 "is_configured": false, 00:39:52.615 "data_offset": 2048, 00:39:52.615 "data_size": 63488 00:39:52.615 }, 00:39:52.615 { 00:39:52.615 "name": "BaseBdev3", 00:39:52.615 "uuid": "c72d4910-dd99-5f92-b8e6-896958e178a1", 00:39:52.615 "is_configured": true, 00:39:52.615 "data_offset": 2048, 00:39:52.615 "data_size": 63488 00:39:52.615 }, 00:39:52.615 { 00:39:52.615 "name": "BaseBdev4", 00:39:52.615 "uuid": "43f9b2e0-5023-540e-a976-947942981aee", 00:39:52.615 "is_configured": true, 00:39:52.615 "data_offset": 2048, 00:39:52.615 "data_size": 63488 00:39:52.615 } 00:39:52.615 ] 00:39:52.615 }' 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # 
[[ none == \n\o\n\e ]] 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 239553 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 239553 ']' 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 239553 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:39:52.615 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 239553 00:39:52.874 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:39:52.874 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:39:52.874 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 239553' 00:39:52.874 killing process with pid 239553 00:39:52.874 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 239553 00:39:52.874 Received shutdown signal, test time was about 22.765210 seconds 00:39:52.874 00:39:52.874 Latency(us) 00:39:52.874 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:52.874 =================================================================================================================== 00:39:52.874 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:39:52.874 [2024-06-10 11:50:36.573125] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:39:52.874 [2024-06-10 11:50:36.573200] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:52.874 [2024-06-10 11:50:36.573240] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:52.874 [2024-06-10 
11:50:36.573249] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd9f70 name raid_bdev1, state offline 00:39:52.874 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 239553 00:39:52.874 [2024-06-10 11:50:36.619532] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:39:53.133 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:39:53.133 00:39:53.133 real 0m27.021s 00:39:53.133 user 0m41.080s 00:39:53.133 sys 0m4.227s 00:39:53.133 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:39:53.133 11:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:39:53.133 ************************************ 00:39:53.133 END TEST raid_rebuild_test_sb_io 00:39:53.133 ************************************ 00:39:53.133 11:50:36 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:39:53.133 11:50:36 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:39:53.133 11:50:36 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:39:53.133 11:50:36 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:39:53.133 11:50:36 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:39:53.133 11:50:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:39:53.133 ************************************ 00:39:53.133 START TEST raid_state_function_test_sb_4k 00:39:53.133 ************************************ 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:39:53.133 11:50:36 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:39:53.133 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@234 -- # strip_size=0 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=243456 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 243456' 00:39:53.134 Process raid pid: 243456 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 243456 /var/tmp/spdk-raid.sock 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 243456 ']' 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:39:53.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:39:53.134 11:50:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:39:53.134 [2024-06-10 11:50:36.979131] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:39:53.134 [2024-06-10 11:50:36.979182] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:39:53.134 [2024-06-10 11:50:37.067514] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:53.393 [2024-06-10 11:50:37.147073] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:39:53.393 [2024-06-10 11:50:37.200766] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:53.393 [2024-06-10 11:50:37.200794] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:53.960 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:39:53.960 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:39:53.960 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:39:54.219 [2024-06-10 11:50:37.920286] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:39:54.219 [2024-06-10 11:50:37.920322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:39:54.219 [2024-06-10 11:50:37.920329] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:39:54.219 [2024-06-10 11:50:37.920337] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:54.219 11:50:37 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:54.219 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:54.220 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:54.220 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:54.220 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:54.220 11:50:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:54.220 11:50:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:54.220 "name": "Existed_Raid", 00:39:54.220 "uuid": "ef6e8ef1-9ae0-4e0b-9292-4260d71cdc76", 00:39:54.220 "strip_size_kb": 0, 00:39:54.220 "state": "configuring", 00:39:54.220 "raid_level": "raid1", 00:39:54.220 "superblock": true, 00:39:54.220 "num_base_bdevs": 2, 00:39:54.220 "num_base_bdevs_discovered": 0, 00:39:54.220 "num_base_bdevs_operational": 2, 00:39:54.220 "base_bdevs_list": [ 00:39:54.220 { 00:39:54.220 "name": "BaseBdev1", 00:39:54.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:54.220 "is_configured": false, 00:39:54.220 "data_offset": 0, 00:39:54.220 "data_size": 0 00:39:54.220 }, 
00:39:54.220 { 00:39:54.220 "name": "BaseBdev2", 00:39:54.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:54.220 "is_configured": false, 00:39:54.220 "data_offset": 0, 00:39:54.220 "data_size": 0 00:39:54.220 } 00:39:54.220 ] 00:39:54.220 }' 00:39:54.220 11:50:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:54.220 11:50:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:39:54.787 11:50:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:39:55.046 [2024-06-10 11:50:38.758355] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:39:55.046 [2024-06-10 11:50:38.758379] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x204c510 name Existed_Raid, state configuring 00:39:55.046 11:50:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:39:55.046 [2024-06-10 11:50:38.938847] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:39:55.046 [2024-06-10 11:50:38.938873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:39:55.046 [2024-06-10 11:50:38.938879] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:39:55.046 [2024-06-10 11:50:38.938887] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:39:55.046 11:50:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:39:55.305 [2024-06-10 11:50:39.120037] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:55.305 BaseBdev1 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:39:55.305 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:39:55.564 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:39:55.565 [ 00:39:55.565 { 00:39:55.565 "name": "BaseBdev1", 00:39:55.565 "aliases": [ 00:39:55.565 "b6651527-7bd8-450c-b4b5-27e1dec25f5b" 00:39:55.565 ], 00:39:55.565 "product_name": "Malloc disk", 00:39:55.565 "block_size": 4096, 00:39:55.565 "num_blocks": 8192, 00:39:55.565 "uuid": "b6651527-7bd8-450c-b4b5-27e1dec25f5b", 00:39:55.565 "assigned_rate_limits": { 00:39:55.565 "rw_ios_per_sec": 0, 00:39:55.565 "rw_mbytes_per_sec": 0, 00:39:55.565 "r_mbytes_per_sec": 0, 00:39:55.565 "w_mbytes_per_sec": 0 00:39:55.565 }, 00:39:55.565 "claimed": true, 00:39:55.565 "claim_type": "exclusive_write", 00:39:55.565 "zoned": false, 00:39:55.565 "supported_io_types": { 00:39:55.565 "read": true, 00:39:55.565 "write": true, 00:39:55.565 "unmap": 
true, 00:39:55.565 "write_zeroes": true, 00:39:55.565 "flush": true, 00:39:55.565 "reset": true, 00:39:55.565 "compare": false, 00:39:55.565 "compare_and_write": false, 00:39:55.565 "abort": true, 00:39:55.565 "nvme_admin": false, 00:39:55.565 "nvme_io": false 00:39:55.565 }, 00:39:55.565 "memory_domains": [ 00:39:55.565 { 00:39:55.565 "dma_device_id": "system", 00:39:55.565 "dma_device_type": 1 00:39:55.565 }, 00:39:55.565 { 00:39:55.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:55.565 "dma_device_type": 2 00:39:55.565 } 00:39:55.565 ], 00:39:55.565 "driver_specific": {} 00:39:55.565 } 00:39:55.565 ] 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:55.565 11:50:39 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:55.565 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:55.824 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:55.824 "name": "Existed_Raid", 00:39:55.824 "uuid": "cb14e87c-7fac-42b1-83d6-9e018ce92504", 00:39:55.824 "strip_size_kb": 0, 00:39:55.824 "state": "configuring", 00:39:55.824 "raid_level": "raid1", 00:39:55.824 "superblock": true, 00:39:55.824 "num_base_bdevs": 2, 00:39:55.824 "num_base_bdevs_discovered": 1, 00:39:55.824 "num_base_bdevs_operational": 2, 00:39:55.824 "base_bdevs_list": [ 00:39:55.824 { 00:39:55.824 "name": "BaseBdev1", 00:39:55.824 "uuid": "b6651527-7bd8-450c-b4b5-27e1dec25f5b", 00:39:55.824 "is_configured": true, 00:39:55.824 "data_offset": 256, 00:39:55.824 "data_size": 7936 00:39:55.824 }, 00:39:55.824 { 00:39:55.824 "name": "BaseBdev2", 00:39:55.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:55.824 "is_configured": false, 00:39:55.825 "data_offset": 0, 00:39:55.825 "data_size": 0 00:39:55.825 } 00:39:55.825 ] 00:39:55.825 }' 00:39:55.825 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:55.825 11:50:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:39:56.393 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:39:56.393 [2024-06-10 11:50:40.327157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:39:56.393 [2024-06-10 11:50:40.327194] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x204be00 name Existed_Raid, state 
configuring 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:39:56.652 [2024-06-10 11:50:40.499623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:56.652 [2024-06-10 11:50:40.500657] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:39:56.652 [2024-06-10 11:50:40.500682] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:56.652 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:56.911 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:56.911 "name": "Existed_Raid", 00:39:56.911 "uuid": "caea2180-e05d-4652-b925-7a5aed8758eb", 00:39:56.911 "strip_size_kb": 0, 00:39:56.911 "state": "configuring", 00:39:56.911 "raid_level": "raid1", 00:39:56.911 "superblock": true, 00:39:56.911 "num_base_bdevs": 2, 00:39:56.911 "num_base_bdevs_discovered": 1, 00:39:56.911 "num_base_bdevs_operational": 2, 00:39:56.911 "base_bdevs_list": [ 00:39:56.911 { 00:39:56.911 "name": "BaseBdev1", 00:39:56.911 "uuid": "b6651527-7bd8-450c-b4b5-27e1dec25f5b", 00:39:56.911 "is_configured": true, 00:39:56.911 "data_offset": 256, 00:39:56.911 "data_size": 7936 00:39:56.911 }, 00:39:56.911 { 00:39:56.911 "name": "BaseBdev2", 00:39:56.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:56.911 "is_configured": false, 00:39:56.911 "data_offset": 0, 00:39:56.911 "data_size": 0 00:39:56.911 } 00:39:56.911 ] 00:39:56.911 }' 00:39:56.911 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:56.911 11:50:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:39:57.479 [2024-06-10 11:50:41.368721] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
00:39:57.479 [2024-06-10 11:50:41.368844] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x204cbf0 00:39:57.479 [2024-06-10 11:50:41.368853] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:57.479 [2024-06-10 11:50:41.369002] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21fe9b0 00:39:57.479 [2024-06-10 11:50:41.369095] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x204cbf0 00:39:57.479 [2024-06-10 11:50:41.369102] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x204cbf0 00:39:57.479 [2024-06-10 11:50:41.369171] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:57.479 BaseBdev2 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:39:57.479 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:39:57.738 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:39:57.997 [ 00:39:57.997 { 00:39:57.997 "name": "BaseBdev2", 
00:39:57.997 "aliases": [ 00:39:57.997 "4d0440be-25b3-493a-a1cc-a1675fd564df" 00:39:57.997 ], 00:39:57.997 "product_name": "Malloc disk", 00:39:57.997 "block_size": 4096, 00:39:57.997 "num_blocks": 8192, 00:39:57.997 "uuid": "4d0440be-25b3-493a-a1cc-a1675fd564df", 00:39:57.997 "assigned_rate_limits": { 00:39:57.997 "rw_ios_per_sec": 0, 00:39:57.997 "rw_mbytes_per_sec": 0, 00:39:57.997 "r_mbytes_per_sec": 0, 00:39:57.997 "w_mbytes_per_sec": 0 00:39:57.997 }, 00:39:57.997 "claimed": true, 00:39:57.997 "claim_type": "exclusive_write", 00:39:57.997 "zoned": false, 00:39:57.997 "supported_io_types": { 00:39:57.997 "read": true, 00:39:57.997 "write": true, 00:39:57.997 "unmap": true, 00:39:57.997 "write_zeroes": true, 00:39:57.997 "flush": true, 00:39:57.997 "reset": true, 00:39:57.997 "compare": false, 00:39:57.997 "compare_and_write": false, 00:39:57.997 "abort": true, 00:39:57.997 "nvme_admin": false, 00:39:57.997 "nvme_io": false 00:39:57.997 }, 00:39:57.997 "memory_domains": [ 00:39:57.997 { 00:39:57.997 "dma_device_id": "system", 00:39:57.997 "dma_device_type": 1 00:39:57.997 }, 00:39:57.997 { 00:39:57.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:57.997 "dma_device_type": 2 00:39:57.997 } 00:39:57.998 ], 00:39:57.998 "driver_specific": {} 00:39:57.998 } 00:39:57.998 ] 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:57.998 "name": "Existed_Raid", 00:39:57.998 "uuid": "caea2180-e05d-4652-b925-7a5aed8758eb", 00:39:57.998 "strip_size_kb": 0, 00:39:57.998 "state": "online", 00:39:57.998 "raid_level": "raid1", 00:39:57.998 "superblock": true, 00:39:57.998 "num_base_bdevs": 2, 00:39:57.998 "num_base_bdevs_discovered": 2, 00:39:57.998 "num_base_bdevs_operational": 2, 00:39:57.998 "base_bdevs_list": [ 00:39:57.998 { 00:39:57.998 "name": "BaseBdev1", 00:39:57.998 "uuid": "b6651527-7bd8-450c-b4b5-27e1dec25f5b", 00:39:57.998 "is_configured": true, 00:39:57.998 "data_offset": 256, 00:39:57.998 "data_size": 7936 00:39:57.998 }, 00:39:57.998 { 00:39:57.998 "name": "BaseBdev2", 00:39:57.998 "uuid": 
"4d0440be-25b3-493a-a1cc-a1675fd564df", 00:39:57.998 "is_configured": true, 00:39:57.998 "data_offset": 256, 00:39:57.998 "data_size": 7936 00:39:57.998 } 00:39:57.998 ] 00:39:57.998 }' 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:57.998 11:50:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:39:58.567 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:39:58.826 [2024-06-10 11:50:42.563988] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:58.826 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:39:58.826 "name": "Existed_Raid", 00:39:58.826 "aliases": [ 00:39:58.826 "caea2180-e05d-4652-b925-7a5aed8758eb" 00:39:58.826 ], 00:39:58.826 "product_name": "Raid Volume", 00:39:58.826 "block_size": 4096, 00:39:58.826 "num_blocks": 7936, 00:39:58.826 "uuid": "caea2180-e05d-4652-b925-7a5aed8758eb", 00:39:58.826 "assigned_rate_limits": { 
00:39:58.826 "rw_ios_per_sec": 0, 00:39:58.826 "rw_mbytes_per_sec": 0, 00:39:58.826 "r_mbytes_per_sec": 0, 00:39:58.826 "w_mbytes_per_sec": 0 00:39:58.826 }, 00:39:58.826 "claimed": false, 00:39:58.826 "zoned": false, 00:39:58.826 "supported_io_types": { 00:39:58.826 "read": true, 00:39:58.826 "write": true, 00:39:58.826 "unmap": false, 00:39:58.826 "write_zeroes": true, 00:39:58.826 "flush": false, 00:39:58.826 "reset": true, 00:39:58.826 "compare": false, 00:39:58.826 "compare_and_write": false, 00:39:58.826 "abort": false, 00:39:58.826 "nvme_admin": false, 00:39:58.826 "nvme_io": false 00:39:58.826 }, 00:39:58.826 "memory_domains": [ 00:39:58.826 { 00:39:58.826 "dma_device_id": "system", 00:39:58.826 "dma_device_type": 1 00:39:58.826 }, 00:39:58.826 { 00:39:58.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:58.826 "dma_device_type": 2 00:39:58.826 }, 00:39:58.826 { 00:39:58.826 "dma_device_id": "system", 00:39:58.826 "dma_device_type": 1 00:39:58.826 }, 00:39:58.826 { 00:39:58.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:58.826 "dma_device_type": 2 00:39:58.826 } 00:39:58.826 ], 00:39:58.826 "driver_specific": { 00:39:58.826 "raid": { 00:39:58.826 "uuid": "caea2180-e05d-4652-b925-7a5aed8758eb", 00:39:58.826 "strip_size_kb": 0, 00:39:58.826 "state": "online", 00:39:58.826 "raid_level": "raid1", 00:39:58.826 "superblock": true, 00:39:58.826 "num_base_bdevs": 2, 00:39:58.826 "num_base_bdevs_discovered": 2, 00:39:58.826 "num_base_bdevs_operational": 2, 00:39:58.826 "base_bdevs_list": [ 00:39:58.826 { 00:39:58.826 "name": "BaseBdev1", 00:39:58.826 "uuid": "b6651527-7bd8-450c-b4b5-27e1dec25f5b", 00:39:58.826 "is_configured": true, 00:39:58.826 "data_offset": 256, 00:39:58.826 "data_size": 7936 00:39:58.826 }, 00:39:58.826 { 00:39:58.826 "name": "BaseBdev2", 00:39:58.826 "uuid": "4d0440be-25b3-493a-a1cc-a1675fd564df", 00:39:58.826 "is_configured": true, 00:39:58.826 "data_offset": 256, 00:39:58.826 "data_size": 7936 00:39:58.826 } 00:39:58.826 ] 00:39:58.826 
} 00:39:58.826 } 00:39:58.826 }' 00:39:58.826 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:39:58.826 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:39:58.826 BaseBdev2' 00:39:58.826 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:58.826 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:39:58.826 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:59.085 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:59.085 "name": "BaseBdev1", 00:39:59.085 "aliases": [ 00:39:59.085 "b6651527-7bd8-450c-b4b5-27e1dec25f5b" 00:39:59.085 ], 00:39:59.085 "product_name": "Malloc disk", 00:39:59.085 "block_size": 4096, 00:39:59.085 "num_blocks": 8192, 00:39:59.085 "uuid": "b6651527-7bd8-450c-b4b5-27e1dec25f5b", 00:39:59.085 "assigned_rate_limits": { 00:39:59.085 "rw_ios_per_sec": 0, 00:39:59.085 "rw_mbytes_per_sec": 0, 00:39:59.085 "r_mbytes_per_sec": 0, 00:39:59.085 "w_mbytes_per_sec": 0 00:39:59.085 }, 00:39:59.085 "claimed": true, 00:39:59.085 "claim_type": "exclusive_write", 00:39:59.085 "zoned": false, 00:39:59.085 "supported_io_types": { 00:39:59.085 "read": true, 00:39:59.085 "write": true, 00:39:59.085 "unmap": true, 00:39:59.086 "write_zeroes": true, 00:39:59.086 "flush": true, 00:39:59.086 "reset": true, 00:39:59.086 "compare": false, 00:39:59.086 "compare_and_write": false, 00:39:59.086 "abort": true, 00:39:59.086 "nvme_admin": false, 00:39:59.086 "nvme_io": false 00:39:59.086 }, 00:39:59.086 "memory_domains": [ 00:39:59.086 { 00:39:59.086 "dma_device_id": "system", 00:39:59.086 
"dma_device_type": 1 00:39:59.086 }, 00:39:59.086 { 00:39:59.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:59.086 "dma_device_type": 2 00:39:59.086 } 00:39:59.086 ], 00:39:59.086 "driver_specific": {} 00:39:59.086 }' 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:39:59.086 11:50:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:59.086 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:59.345 11:50:43 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:59.345 "name": "BaseBdev2", 00:39:59.345 "aliases": [ 00:39:59.345 "4d0440be-25b3-493a-a1cc-a1675fd564df" 00:39:59.345 ], 00:39:59.345 "product_name": "Malloc disk", 00:39:59.345 "block_size": 4096, 00:39:59.345 "num_blocks": 8192, 00:39:59.345 "uuid": "4d0440be-25b3-493a-a1cc-a1675fd564df", 00:39:59.345 "assigned_rate_limits": { 00:39:59.345 "rw_ios_per_sec": 0, 00:39:59.345 "rw_mbytes_per_sec": 0, 00:39:59.345 "r_mbytes_per_sec": 0, 00:39:59.345 "w_mbytes_per_sec": 0 00:39:59.345 }, 00:39:59.345 "claimed": true, 00:39:59.345 "claim_type": "exclusive_write", 00:39:59.345 "zoned": false, 00:39:59.345 "supported_io_types": { 00:39:59.345 "read": true, 00:39:59.345 "write": true, 00:39:59.345 "unmap": true, 00:39:59.345 "write_zeroes": true, 00:39:59.345 "flush": true, 00:39:59.345 "reset": true, 00:39:59.345 "compare": false, 00:39:59.345 "compare_and_write": false, 00:39:59.345 "abort": true, 00:39:59.345 "nvme_admin": false, 00:39:59.345 "nvme_io": false 00:39:59.345 }, 00:39:59.345 "memory_domains": [ 00:39:59.345 { 00:39:59.345 "dma_device_id": "system", 00:39:59.345 "dma_device_type": 1 00:39:59.345 }, 00:39:59.345 { 00:39:59.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:59.345 "dma_device_type": 2 00:39:59.345 } 00:39:59.345 ], 00:39:59.345 "driver_specific": {} 00:39:59.345 }' 00:39:59.345 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:59.604 11:50:43 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:59.604 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:39:59.863 [2024-06-10 11:50:43.718957] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:59.863 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:00.123 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:00.123 "name": "Existed_Raid", 00:40:00.123 "uuid": "caea2180-e05d-4652-b925-7a5aed8758eb", 00:40:00.123 "strip_size_kb": 0, 00:40:00.123 "state": "online", 00:40:00.123 "raid_level": "raid1", 00:40:00.123 "superblock": true, 00:40:00.123 "num_base_bdevs": 2, 00:40:00.123 "num_base_bdevs_discovered": 1, 00:40:00.123 "num_base_bdevs_operational": 1, 00:40:00.123 "base_bdevs_list": [ 00:40:00.123 { 00:40:00.123 "name": null, 00:40:00.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:00.123 "is_configured": false, 00:40:00.123 "data_offset": 256, 00:40:00.123 "data_size": 7936 00:40:00.123 }, 00:40:00.123 { 00:40:00.123 "name": "BaseBdev2", 00:40:00.123 "uuid": 
"4d0440be-25b3-493a-a1cc-a1675fd564df", 00:40:00.123 "is_configured": true, 00:40:00.123 "data_offset": 256, 00:40:00.123 "data_size": 7936 00:40:00.123 } 00:40:00.123 ] 00:40:00.123 }' 00:40:00.123 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:00.123 11:50:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:40:00.691 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:40:00.951 [2024-06-10 11:50:44.747329] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:40:00.951 [2024-06-10 11:50:44.747392] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:00.951 [2024-06-10 11:50:44.757559] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:00.951 [2024-06-10 11:50:44.757587] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:00.951 [2024-06-10 11:50:44.757595] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x204cbf0 name Existed_Raid, state offline 00:40:00.951 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:40:00.951 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:40:00.951 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:00.951 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 243456 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 243456 ']' 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 243456 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:01.211 11:50:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 243456 00:40:01.211 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:01.211 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:01.211 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 243456' 00:40:01.211 killing process with pid 243456 00:40:01.211 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # kill 243456 00:40:01.211 [2024-06-10 11:50:45.009464] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:01.211 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@973 -- # wait 243456 00:40:01.211 [2024-06-10 11:50:45.010294] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:01.470 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:40:01.470 00:40:01.470 real 0m8.294s 00:40:01.470 user 0m14.470s 00:40:01.470 sys 0m1.693s 00:40:01.470 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:01.471 11:50:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:01.471 ************************************ 00:40:01.471 END TEST raid_state_function_test_sb_4k 00:40:01.471 ************************************ 00:40:01.471 11:50:45 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:40:01.471 11:50:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:40:01.471 11:50:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:01.471 11:50:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:01.471 ************************************ 00:40:01.471 START TEST raid_superblock_test_4k 00:40:01.471 ************************************ 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 
00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=244837 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 244837 /var/tmp/spdk-raid.sock 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@830 -- # '[' -z 244837 ']' 00:40:01.471 11:50:45 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:01.471 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:01.471 11:50:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:01.471 [2024-06-10 11:50:45.351991] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:40:01.471 [2024-06-10 11:50:45.352043] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid244837 ] 00:40:01.730 [2024-06-10 11:50:45.438980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:01.730 [2024-06-10 11:50:45.526886] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:01.730 [2024-06-10 11:50:45.585831] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:01.730 [2024-06-10 11:50:45.585864] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@863 -- # return 0 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:02.299 
11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:40:02.299 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:40:02.558 malloc1 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:02.558 [2024-06-10 11:50:46.477400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:02.558 [2024-06-10 11:50:46.477438] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:02.558 [2024-06-10 11:50:46.477453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f6c100 00:40:02.558 [2024-06-10 11:50:46.477461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:02.558 [2024-06-10 11:50:46.478690] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:02.558 [2024-06-10 11:50:46.478714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:02.558 pt1 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:40:02.558 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:40:02.817 malloc2 00:40:02.817 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:03.077 [2024-06-10 11:50:46.810063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:03.077 [2024-06-10 11:50:46.810098] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:03.077 [2024-06-10 11:50:46.810112] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f6d500 00:40:03.077 [2024-06-10 11:50:46.810120] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:03.077 [2024-06-10 11:50:46.811236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:03.077 [2024-06-10 11:50:46.811260] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:03.077 pt2 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:40:03.077 [2024-06-10 11:50:46.982523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:03.077 [2024-06-10 11:50:46.983517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:03.077 [2024-06-10 11:50:46.983624] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f6cc00 00:40:03.077 [2024-06-10 11:50:46.983633] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:03.077 [2024-06-10 11:50:46.983766] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f82ff0 00:40:03.077 [2024-06-10 11:50:46.983880] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f6cc00 00:40:03.077 [2024-06-10 11:50:46.983907] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f6cc00 00:40:03.077 [2024-06-10 11:50:46.983976] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:03.077 11:50:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:03.077 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:03.077 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:03.336 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:03.336 "name": "raid_bdev1", 00:40:03.336 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:03.336 "strip_size_kb": 0, 00:40:03.336 "state": "online", 00:40:03.336 "raid_level": "raid1", 00:40:03.336 "superblock": true, 00:40:03.336 "num_base_bdevs": 2, 00:40:03.336 "num_base_bdevs_discovered": 2, 00:40:03.336 "num_base_bdevs_operational": 2, 00:40:03.336 "base_bdevs_list": [ 00:40:03.336 { 00:40:03.336 "name": "pt1", 00:40:03.336 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:03.336 "is_configured": true, 00:40:03.336 "data_offset": 256, 00:40:03.336 "data_size": 7936 00:40:03.336 }, 00:40:03.336 { 00:40:03.336 "name": "pt2", 00:40:03.336 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:03.336 "is_configured": true, 00:40:03.336 "data_offset": 256, 00:40:03.336 "data_size": 7936 00:40:03.336 } 00:40:03.336 ] 00:40:03.336 }' 00:40:03.336 11:50:47 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:03.336 11:50:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:03.903 [2024-06-10 11:50:47.800792] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:03.903 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:03.903 "name": "raid_bdev1", 00:40:03.903 "aliases": [ 00:40:03.903 "008ba7ee-00ea-4fd8-80f2-c995f4c4efba" 00:40:03.903 ], 00:40:03.903 "product_name": "Raid Volume", 00:40:03.903 "block_size": 4096, 00:40:03.903 "num_blocks": 7936, 00:40:03.903 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:03.903 "assigned_rate_limits": { 00:40:03.903 "rw_ios_per_sec": 0, 00:40:03.903 "rw_mbytes_per_sec": 0, 00:40:03.903 "r_mbytes_per_sec": 0, 00:40:03.903 "w_mbytes_per_sec": 0 00:40:03.903 }, 00:40:03.903 "claimed": false, 00:40:03.903 "zoned": false, 00:40:03.903 "supported_io_types": { 00:40:03.903 "read": true, 00:40:03.903 
"write": true, 00:40:03.903 "unmap": false, 00:40:03.903 "write_zeroes": true, 00:40:03.903 "flush": false, 00:40:03.903 "reset": true, 00:40:03.903 "compare": false, 00:40:03.903 "compare_and_write": false, 00:40:03.903 "abort": false, 00:40:03.903 "nvme_admin": false, 00:40:03.903 "nvme_io": false 00:40:03.903 }, 00:40:03.903 "memory_domains": [ 00:40:03.903 { 00:40:03.903 "dma_device_id": "system", 00:40:03.903 "dma_device_type": 1 00:40:03.903 }, 00:40:03.903 { 00:40:03.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:03.903 "dma_device_type": 2 00:40:03.903 }, 00:40:03.903 { 00:40:03.903 "dma_device_id": "system", 00:40:03.903 "dma_device_type": 1 00:40:03.903 }, 00:40:03.903 { 00:40:03.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:03.903 "dma_device_type": 2 00:40:03.903 } 00:40:03.903 ], 00:40:03.903 "driver_specific": { 00:40:03.903 "raid": { 00:40:03.903 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:03.903 "strip_size_kb": 0, 00:40:03.903 "state": "online", 00:40:03.903 "raid_level": "raid1", 00:40:03.903 "superblock": true, 00:40:03.903 "num_base_bdevs": 2, 00:40:03.903 "num_base_bdevs_discovered": 2, 00:40:03.903 "num_base_bdevs_operational": 2, 00:40:03.903 "base_bdevs_list": [ 00:40:03.903 { 00:40:03.903 "name": "pt1", 00:40:03.903 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:03.903 "is_configured": true, 00:40:03.903 "data_offset": 256, 00:40:03.903 "data_size": 7936 00:40:03.903 }, 00:40:03.903 { 00:40:03.903 "name": "pt2", 00:40:03.903 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:03.904 "is_configured": true, 00:40:03.904 "data_offset": 256, 00:40:03.904 "data_size": 7936 00:40:03.904 } 00:40:03.904 ] 00:40:03.904 } 00:40:03.904 } 00:40:03.904 }' 00:40:03.904 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:04.163 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:40:04.163 pt2' 00:40:04.163 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:04.163 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:40:04.163 11:50:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:04.163 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:04.163 "name": "pt1", 00:40:04.163 "aliases": [ 00:40:04.163 "00000000-0000-0000-0000-000000000001" 00:40:04.163 ], 00:40:04.163 "product_name": "passthru", 00:40:04.163 "block_size": 4096, 00:40:04.163 "num_blocks": 8192, 00:40:04.163 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:04.163 "assigned_rate_limits": { 00:40:04.163 "rw_ios_per_sec": 0, 00:40:04.163 "rw_mbytes_per_sec": 0, 00:40:04.163 "r_mbytes_per_sec": 0, 00:40:04.163 "w_mbytes_per_sec": 0 00:40:04.163 }, 00:40:04.163 "claimed": true, 00:40:04.163 "claim_type": "exclusive_write", 00:40:04.163 "zoned": false, 00:40:04.163 "supported_io_types": { 00:40:04.163 "read": true, 00:40:04.163 "write": true, 00:40:04.163 "unmap": true, 00:40:04.163 "write_zeroes": true, 00:40:04.163 "flush": true, 00:40:04.163 "reset": true, 00:40:04.163 "compare": false, 00:40:04.163 "compare_and_write": false, 00:40:04.163 "abort": true, 00:40:04.163 "nvme_admin": false, 00:40:04.163 "nvme_io": false 00:40:04.163 }, 00:40:04.163 "memory_domains": [ 00:40:04.163 { 00:40:04.163 "dma_device_id": "system", 00:40:04.163 "dma_device_type": 1 00:40:04.163 }, 00:40:04.163 { 00:40:04.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:04.163 "dma_device_type": 2 00:40:04.163 } 00:40:04.163 ], 00:40:04.163 "driver_specific": { 00:40:04.163 "passthru": { 00:40:04.163 "name": "pt1", 00:40:04.163 "base_bdev_name": "malloc1" 00:40:04.163 } 00:40:04.163 } 00:40:04.163 }' 00:40:04.163 
11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:04.163 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:40:04.422 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:04.681 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:04.681 "name": "pt2", 00:40:04.681 "aliases": [ 00:40:04.681 "00000000-0000-0000-0000-000000000002" 00:40:04.681 ], 00:40:04.681 "product_name": "passthru", 00:40:04.681 "block_size": 4096, 00:40:04.681 "num_blocks": 8192, 00:40:04.681 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:40:04.681 "assigned_rate_limits": { 00:40:04.681 "rw_ios_per_sec": 0, 00:40:04.681 "rw_mbytes_per_sec": 0, 00:40:04.681 "r_mbytes_per_sec": 0, 00:40:04.681 "w_mbytes_per_sec": 0 00:40:04.681 }, 00:40:04.681 "claimed": true, 00:40:04.681 "claim_type": "exclusive_write", 00:40:04.681 "zoned": false, 00:40:04.681 "supported_io_types": { 00:40:04.681 "read": true, 00:40:04.681 "write": true, 00:40:04.681 "unmap": true, 00:40:04.681 "write_zeroes": true, 00:40:04.681 "flush": true, 00:40:04.681 "reset": true, 00:40:04.681 "compare": false, 00:40:04.681 "compare_and_write": false, 00:40:04.681 "abort": true, 00:40:04.681 "nvme_admin": false, 00:40:04.681 "nvme_io": false 00:40:04.681 }, 00:40:04.681 "memory_domains": [ 00:40:04.681 { 00:40:04.681 "dma_device_id": "system", 00:40:04.681 "dma_device_type": 1 00:40:04.681 }, 00:40:04.681 { 00:40:04.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:04.681 "dma_device_type": 2 00:40:04.681 } 00:40:04.681 ], 00:40:04.682 "driver_specific": { 00:40:04.682 "passthru": { 00:40:04.682 "name": "pt2", 00:40:04.682 "base_bdev_name": "malloc2" 00:40:04.682 } 00:40:04.682 } 00:40:04.682 }' 00:40:04.682 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:04.682 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:04.682 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:04.682 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:04.941 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:40:05.200 [2024-06-10 11:50:48.975822] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:05.200 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=008ba7ee-00ea-4fd8-80f2-c995f4c4efba 00:40:05.200 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 008ba7ee-00ea-4fd8-80f2-c995f4c4efba ']' 00:40:05.200 11:50:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:05.459 [2024-06-10 11:50:49.148146] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:05.459 [2024-06-10 11:50:49.148163] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:05.459 [2024-06-10 11:50:49.148201] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:05.459 [2024-06-10 11:50:49.148239] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:05.459 [2024-06-10 11:50:49.148247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f6cc00 name raid_bdev1, state offline 00:40:05.460 
11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:05.460 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:40:05.460 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:40:05.460 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:40:05.460 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:40:05.460 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:40:05.719 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:40:05.719 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@649 -- # local es=0 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- 
# valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:40:05.979 11:50:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:06.238 [2024-06-10 11:50:49.994316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:40:06.238 [2024-06-10 11:50:49.995311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:40:06.238 [2024-06-10 11:50:49.995352] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev 
found on bdev malloc1 00:40:06.238 [2024-06-10 11:50:49.995381] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:40:06.238 [2024-06-10 11:50:49.995393] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:06.238 [2024-06-10 11:50:49.995400] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21169e0 name raid_bdev1, state configuring 00:40:06.238 request: 00:40:06.238 { 00:40:06.238 "name": "raid_bdev1", 00:40:06.238 "raid_level": "raid1", 00:40:06.238 "base_bdevs": [ 00:40:06.238 "malloc1", 00:40:06.238 "malloc2" 00:40:06.238 ], 00:40:06.238 "superblock": false, 00:40:06.238 "method": "bdev_raid_create", 00:40:06.238 "req_id": 1 00:40:06.238 } 00:40:06.238 Got JSON-RPC error response 00:40:06.238 response: 00:40:06.238 { 00:40:06.238 "code": -17, 00:40:06.238 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:40:06.238 } 00:40:06.238 11:50:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # es=1 00:40:06.238 11:50:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:40:06.238 11:50:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:40:06.238 11:50:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:40:06.238 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:06.238 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:06.497 [2024-06-10 11:50:50.347204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:06.497 [2024-06-10 11:50:50.347247] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:06.497 [2024-06-10 11:50:50.347259] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2117e70 00:40:06.497 [2024-06-10 11:50:50.347269] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:06.497 [2024-06-10 11:50:50.348471] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:06.497 [2024-06-10 11:50:50.348495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:06.497 [2024-06-10 11:50:50.348547] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:40:06.497 [2024-06-10 11:50:50.348568] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:06.497 pt1 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:40:06.497 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:06.498 11:50:50 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:06.498 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:06.757 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:06.757 "name": "raid_bdev1", 00:40:06.757 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:06.757 "strip_size_kb": 0, 00:40:06.757 "state": "configuring", 00:40:06.757 "raid_level": "raid1", 00:40:06.757 "superblock": true, 00:40:06.757 "num_base_bdevs": 2, 00:40:06.757 "num_base_bdevs_discovered": 1, 00:40:06.757 "num_base_bdevs_operational": 2, 00:40:06.757 "base_bdevs_list": [ 00:40:06.757 { 00:40:06.757 "name": "pt1", 00:40:06.757 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:06.757 "is_configured": true, 00:40:06.757 "data_offset": 256, 00:40:06.757 "data_size": 7936 00:40:06.757 }, 00:40:06.757 { 00:40:06.757 "name": null, 00:40:06.757 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:06.757 "is_configured": false, 00:40:06.757 "data_offset": 256, 00:40:06.757 "data_size": 7936 00:40:06.757 } 00:40:06.757 ] 00:40:06.757 }' 00:40:06.757 11:50:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:06.757 11:50:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 
00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:07.326 [2024-06-10 11:50:51.165315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:07.326 [2024-06-10 11:50:51.165347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:07.326 [2024-06-10 11:50:51.165358] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2115340 00:40:07.326 [2024-06-10 11:50:51.165366] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:07.326 [2024-06-10 11:50:51.165589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:07.326 [2024-06-10 11:50:51.165602] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:07.326 [2024-06-10 11:50:51.165643] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:40:07.326 [2024-06-10 11:50:51.165656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:07.326 [2024-06-10 11:50:51.165724] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f6b490 00:40:07.326 [2024-06-10 11:50:51.165731] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:07.326 [2024-06-10 11:50:51.165836] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x211c5d0 00:40:07.326 [2024-06-10 11:50:51.165926] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f6b490 00:40:07.326 [2024-06-10 11:50:51.165933] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f6b490 00:40:07.326 [2024-06-10 
11:50:51.165997] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:07.326 pt2 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:07.326 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:07.586 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:07.586 "name": "raid_bdev1", 00:40:07.586 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:07.586 
"strip_size_kb": 0, 00:40:07.586 "state": "online", 00:40:07.586 "raid_level": "raid1", 00:40:07.586 "superblock": true, 00:40:07.586 "num_base_bdevs": 2, 00:40:07.586 "num_base_bdevs_discovered": 2, 00:40:07.586 "num_base_bdevs_operational": 2, 00:40:07.586 "base_bdevs_list": [ 00:40:07.586 { 00:40:07.586 "name": "pt1", 00:40:07.586 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:07.586 "is_configured": true, 00:40:07.586 "data_offset": 256, 00:40:07.586 "data_size": 7936 00:40:07.586 }, 00:40:07.586 { 00:40:07.586 "name": "pt2", 00:40:07.586 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:07.586 "is_configured": true, 00:40:07.586 "data_offset": 256, 00:40:07.586 "data_size": 7936 00:40:07.586 } 00:40:07.586 ] 00:40:07.586 }' 00:40:07.586 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:07.586 11:50:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:08.154 11:50:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:08.154 [2024-06-10 11:50:52.015660] bdev_raid.c:1107:raid_bdev_dump_info_json: 
*DEBUG*: raid_bdev_dump_config_json 00:40:08.154 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:08.154 "name": "raid_bdev1", 00:40:08.154 "aliases": [ 00:40:08.154 "008ba7ee-00ea-4fd8-80f2-c995f4c4efba" 00:40:08.154 ], 00:40:08.154 "product_name": "Raid Volume", 00:40:08.154 "block_size": 4096, 00:40:08.154 "num_blocks": 7936, 00:40:08.154 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:08.154 "assigned_rate_limits": { 00:40:08.154 "rw_ios_per_sec": 0, 00:40:08.154 "rw_mbytes_per_sec": 0, 00:40:08.154 "r_mbytes_per_sec": 0, 00:40:08.154 "w_mbytes_per_sec": 0 00:40:08.154 }, 00:40:08.154 "claimed": false, 00:40:08.154 "zoned": false, 00:40:08.154 "supported_io_types": { 00:40:08.154 "read": true, 00:40:08.154 "write": true, 00:40:08.154 "unmap": false, 00:40:08.154 "write_zeroes": true, 00:40:08.154 "flush": false, 00:40:08.154 "reset": true, 00:40:08.154 "compare": false, 00:40:08.154 "compare_and_write": false, 00:40:08.154 "abort": false, 00:40:08.154 "nvme_admin": false, 00:40:08.154 "nvme_io": false 00:40:08.154 }, 00:40:08.154 "memory_domains": [ 00:40:08.154 { 00:40:08.154 "dma_device_id": "system", 00:40:08.154 "dma_device_type": 1 00:40:08.154 }, 00:40:08.154 { 00:40:08.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:08.154 "dma_device_type": 2 00:40:08.154 }, 00:40:08.154 { 00:40:08.154 "dma_device_id": "system", 00:40:08.154 "dma_device_type": 1 00:40:08.154 }, 00:40:08.154 { 00:40:08.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:08.154 "dma_device_type": 2 00:40:08.154 } 00:40:08.154 ], 00:40:08.154 "driver_specific": { 00:40:08.154 "raid": { 00:40:08.154 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:08.154 "strip_size_kb": 0, 00:40:08.154 "state": "online", 00:40:08.154 "raid_level": "raid1", 00:40:08.154 "superblock": true, 00:40:08.154 "num_base_bdevs": 2, 00:40:08.154 "num_base_bdevs_discovered": 2, 00:40:08.154 "num_base_bdevs_operational": 2, 00:40:08.154 "base_bdevs_list": [ 
00:40:08.154 { 00:40:08.154 "name": "pt1", 00:40:08.154 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:08.154 "is_configured": true, 00:40:08.154 "data_offset": 256, 00:40:08.154 "data_size": 7936 00:40:08.154 }, 00:40:08.154 { 00:40:08.154 "name": "pt2", 00:40:08.154 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:08.154 "is_configured": true, 00:40:08.154 "data_offset": 256, 00:40:08.154 "data_size": 7936 00:40:08.154 } 00:40:08.154 ] 00:40:08.154 } 00:40:08.154 } 00:40:08.154 }' 00:40:08.154 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:08.154 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:40:08.154 pt2' 00:40:08.154 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:08.154 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:40:08.154 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:08.413 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:08.413 "name": "pt1", 00:40:08.413 "aliases": [ 00:40:08.413 "00000000-0000-0000-0000-000000000001" 00:40:08.413 ], 00:40:08.413 "product_name": "passthru", 00:40:08.413 "block_size": 4096, 00:40:08.413 "num_blocks": 8192, 00:40:08.413 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:08.413 "assigned_rate_limits": { 00:40:08.413 "rw_ios_per_sec": 0, 00:40:08.413 "rw_mbytes_per_sec": 0, 00:40:08.413 "r_mbytes_per_sec": 0, 00:40:08.413 "w_mbytes_per_sec": 0 00:40:08.413 }, 00:40:08.413 "claimed": true, 00:40:08.413 "claim_type": "exclusive_write", 00:40:08.413 "zoned": false, 00:40:08.413 "supported_io_types": { 00:40:08.413 "read": true, 00:40:08.413 "write": true, 
00:40:08.413 "unmap": true, 00:40:08.413 "write_zeroes": true, 00:40:08.413 "flush": true, 00:40:08.413 "reset": true, 00:40:08.413 "compare": false, 00:40:08.413 "compare_and_write": false, 00:40:08.413 "abort": true, 00:40:08.413 "nvme_admin": false, 00:40:08.413 "nvme_io": false 00:40:08.413 }, 00:40:08.413 "memory_domains": [ 00:40:08.413 { 00:40:08.413 "dma_device_id": "system", 00:40:08.413 "dma_device_type": 1 00:40:08.413 }, 00:40:08.413 { 00:40:08.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:08.413 "dma_device_type": 2 00:40:08.413 } 00:40:08.413 ], 00:40:08.413 "driver_specific": { 00:40:08.413 "passthru": { 00:40:08.413 "name": "pt1", 00:40:08.413 "base_bdev_name": "malloc1" 00:40:08.413 } 00:40:08.413 } 00:40:08.413 }' 00:40:08.413 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:08.413 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:08.413 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:08.413 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:08.413 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:40:08.672 11:50:52 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:40:08.672 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:08.931 "name": "pt2", 00:40:08.931 "aliases": [ 00:40:08.931 "00000000-0000-0000-0000-000000000002" 00:40:08.931 ], 00:40:08.931 "product_name": "passthru", 00:40:08.931 "block_size": 4096, 00:40:08.931 "num_blocks": 8192, 00:40:08.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:08.931 "assigned_rate_limits": { 00:40:08.931 "rw_ios_per_sec": 0, 00:40:08.931 "rw_mbytes_per_sec": 0, 00:40:08.931 "r_mbytes_per_sec": 0, 00:40:08.931 "w_mbytes_per_sec": 0 00:40:08.931 }, 00:40:08.931 "claimed": true, 00:40:08.931 "claim_type": "exclusive_write", 00:40:08.931 "zoned": false, 00:40:08.931 "supported_io_types": { 00:40:08.931 "read": true, 00:40:08.931 "write": true, 00:40:08.931 "unmap": true, 00:40:08.931 "write_zeroes": true, 00:40:08.931 "flush": true, 00:40:08.931 "reset": true, 00:40:08.931 "compare": false, 00:40:08.931 "compare_and_write": false, 00:40:08.931 "abort": true, 00:40:08.931 "nvme_admin": false, 00:40:08.931 "nvme_io": false 00:40:08.931 }, 00:40:08.931 "memory_domains": [ 00:40:08.931 { 00:40:08.931 "dma_device_id": "system", 00:40:08.931 "dma_device_type": 1 00:40:08.931 }, 00:40:08.931 { 00:40:08.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:08.931 "dma_device_type": 2 00:40:08.931 } 00:40:08.931 ], 00:40:08.931 "driver_specific": { 00:40:08.931 "passthru": { 00:40:08.931 "name": "pt2", 00:40:08.931 "base_bdev_name": "malloc2" 00:40:08.931 } 00:40:08.931 } 00:40:08.931 }' 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:08.931 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:09.189 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:40:09.189 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:09.189 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:09.189 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:40:09.189 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:09.190 11:50:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:40:09.190 [2024-06-10 11:50:53.130559] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 008ba7ee-00ea-4fd8-80f2-c995f4c4efba '!=' 008ba7ee-00ea-4fd8-80f2-c995f4c4efba ']' 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:40:09.449 11:50:53 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:40:09.449 [2024-06-10 11:50:53.314903] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:09.449 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:09.708 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:09.708 "name": "raid_bdev1", 00:40:09.708 "uuid": 
"008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:09.708 "strip_size_kb": 0, 00:40:09.708 "state": "online", 00:40:09.708 "raid_level": "raid1", 00:40:09.708 "superblock": true, 00:40:09.708 "num_base_bdevs": 2, 00:40:09.708 "num_base_bdevs_discovered": 1, 00:40:09.708 "num_base_bdevs_operational": 1, 00:40:09.708 "base_bdevs_list": [ 00:40:09.708 { 00:40:09.708 "name": null, 00:40:09.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:09.708 "is_configured": false, 00:40:09.708 "data_offset": 256, 00:40:09.708 "data_size": 7936 00:40:09.708 }, 00:40:09.708 { 00:40:09.708 "name": "pt2", 00:40:09.708 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:09.708 "is_configured": true, 00:40:09.708 "data_offset": 256, 00:40:09.708 "data_size": 7936 00:40:09.708 } 00:40:09.708 ] 00:40:09.708 }' 00:40:09.708 11:50:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:09.708 11:50:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:10.297 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:10.297 [2024-06-10 11:50:54.149029] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:10.297 [2024-06-10 11:50:54.149054] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:10.297 [2024-06-10 11:50:54.149090] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:10.297 [2024-06-10 11:50:54.149119] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:10.297 [2024-06-10 11:50:54.149127] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f6b490 name raid_bdev1, state offline 00:40:10.297 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:10.297 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:40:10.575 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:40:10.575 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:40:10.575 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:40:10.575 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:40:10.575 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:10.835 [2024-06-10 11:50:54.674371] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:10.835 [2024-06-10 11:50:54.674405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:10.835 [2024-06-10 11:50:54.674415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2115570 00:40:10.835 [2024-06-10 11:50:54.674423] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:10.835 [2024-06-10 11:50:54.675579] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:10.835 [2024-06-10 11:50:54.675601] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:10.835 [2024-06-10 11:50:54.675649] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:40:10.835 [2024-06-10 11:50:54.675667] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:10.835 [2024-06-10 11:50:54.675726] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2119340 00:40:10.835 [2024-06-10 11:50:54.675733] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:10.835 [2024-06-10 11:50:54.675845] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2117050 00:40:10.835 [2024-06-10 11:50:54.675935] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2119340 00:40:10.835 [2024-06-10 11:50:54.675943] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2119340 00:40:10.835 [2024-06-10 11:50:54.676006] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:10.835 pt2 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=1 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:10.835 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:10.836 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:10.836 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:10.836 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:11.095 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:11.095 "name": "raid_bdev1", 00:40:11.095 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:11.095 "strip_size_kb": 0, 00:40:11.095 "state": "online", 00:40:11.095 "raid_level": "raid1", 00:40:11.095 "superblock": true, 00:40:11.095 "num_base_bdevs": 2, 00:40:11.095 "num_base_bdevs_discovered": 1, 00:40:11.095 "num_base_bdevs_operational": 1, 00:40:11.095 "base_bdevs_list": [ 00:40:11.095 { 00:40:11.095 "name": null, 00:40:11.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:11.095 "is_configured": false, 00:40:11.095 "data_offset": 256, 00:40:11.095 "data_size": 7936 00:40:11.095 }, 00:40:11.095 { 00:40:11.095 "name": "pt2", 00:40:11.095 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:11.095 "is_configured": true, 00:40:11.095 "data_offset": 256, 00:40:11.095 "data_size": 7936 00:40:11.095 } 00:40:11.095 ] 00:40:11.095 }' 00:40:11.095 11:50:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:11.095 11:50:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:11.662 11:50:55 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:11.662 [2024-06-10 11:50:55.520570] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:11.662 [2024-06-10 11:50:55.520592] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:11.662 [2024-06-10 11:50:55.520629] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:11.662 [2024-06-10 11:50:55.520660] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:11.662 [2024-06-10 11:50:55.520668] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2119340 name raid_bdev1, state offline 00:40:11.662 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:40:11.662 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:11.920 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:40:11.920 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:40:11.920 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:40:11.920 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:11.920 [2024-06-10 11:50:55.853420] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:11.920 [2024-06-10 11:50:55.853457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:11.920 [2024-06-10 11:50:55.853469] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21190d0 00:40:11.920 [2024-06-10 11:50:55.853477] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:11.920 [2024-06-10 11:50:55.854679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:11.920 [2024-06-10 11:50:55.854706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:11.920 [2024-06-10 11:50:55.854754] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:40:11.920 [2024-06-10 11:50:55.854773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:11.920 [2024-06-10 11:50:55.854844] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:40:11.920 [2024-06-10 11:50:55.854853] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:11.920 [2024-06-10 11:50:55.854862] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2116330 name raid_bdev1, state configuring 00:40:11.920 [2024-06-10 11:50:55.854890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:11.920 [2024-06-10 11:50:55.854931] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x211c360 00:40:11.920 [2024-06-10 11:50:55.854937] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:11.920 [2024-06-10 11:50:55.855048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2117050 00:40:11.920 [2024-06-10 11:50:55.855129] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x211c360 00:40:11.920 [2024-06-10 11:50:55.855135] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x211c360 00:40:11.920 [2024-06-10 11:50:55.855201] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:40:11.920 pt1 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:12.179 11:50:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:12.179 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:12.179 "name": "raid_bdev1", 00:40:12.179 "uuid": "008ba7ee-00ea-4fd8-80f2-c995f4c4efba", 00:40:12.179 "strip_size_kb": 0, 00:40:12.179 "state": "online", 00:40:12.179 "raid_level": "raid1", 00:40:12.179 "superblock": true, 00:40:12.179 "num_base_bdevs": 2, 00:40:12.179 "num_base_bdevs_discovered": 1, 
00:40:12.179 "num_base_bdevs_operational": 1, 00:40:12.179 "base_bdevs_list": [ 00:40:12.179 { 00:40:12.179 "name": null, 00:40:12.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:12.179 "is_configured": false, 00:40:12.179 "data_offset": 256, 00:40:12.179 "data_size": 7936 00:40:12.179 }, 00:40:12.179 { 00:40:12.179 "name": "pt2", 00:40:12.179 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:12.179 "is_configured": true, 00:40:12.179 "data_offset": 256, 00:40:12.179 "data_size": 7936 00:40:12.179 } 00:40:12.179 ] 00:40:12.179 }' 00:40:12.179 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:12.180 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:12.747 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:40:12.747 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:40:12.747 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:40:12.747 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:12.747 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:40:13.006 [2024-06-10 11:50:56.840101] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 008ba7ee-00ea-4fd8-80f2-c995f4c4efba '!=' 008ba7ee-00ea-4fd8-80f2-c995f4c4efba ']' 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 244837 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@949 -- 
# '[' -z 244837 ']' 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # kill -0 244837 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # uname 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 244837 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 244837' 00:40:13.006 killing process with pid 244837 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # kill 244837 00:40:13.006 [2024-06-10 11:50:56.897994] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:13.006 [2024-06-10 11:50:56.898032] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:13.006 [2024-06-10 11:50:56.898059] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:13.006 [2024-06-10 11:50:56.898067] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211c360 name raid_bdev1, state offline 00:40:13.006 11:50:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@973 -- # wait 244837 00:40:13.006 [2024-06-10 11:50:56.913784] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:13.266 11:50:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:40:13.266 00:40:13.266 real 0m11.806s 00:40:13.266 user 0m21.199s 00:40:13.266 sys 0m2.362s 00:40:13.266 11:50:57 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:40:13.266 11:50:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:40:13.266 ************************************ 00:40:13.266 END TEST raid_superblock_test_4k 00:40:13.266 ************************************ 00:40:13.266 11:50:57 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:40:13.266 11:50:57 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:40:13.266 11:50:57 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:40:13.266 11:50:57 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:13.266 11:50:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:13.266 ************************************ 00:40:13.266 START TEST raid_rebuild_test_sb_4k 00:40:13.266 ************************************ 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 
00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=246594 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 246594 /var/tmp/spdk-raid.sock 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 246594 ']' 00:40:13.266 11:50:57 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:13.266 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:13.266 11:50:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:40:13.526 [2024-06-10 11:50:57.243912] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:40:13.526 [2024-06-10 11:50:57.243967] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid246594 ] 00:40:13.526 I/O size of 3145728 is greater than zero copy threshold (65536). 00:40:13.526 Zero copy mechanism will not be used. 
00:40:13.526 [2024-06-10 11:50:57.332207] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:13.526 [2024-06-10 11:50:57.425395] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:13.785 [2024-06-10 11:50:57.488948] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:13.785 [2024-06-10 11:50:57.488980] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:14.353 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:40:14.353 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:40:14.353 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:40:14.353 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:40:14.353 BaseBdev1_malloc 00:40:14.353 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:40:14.612 [2024-06-10 11:50:58.373092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:40:14.612 [2024-06-10 11:50:58.373130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:14.612 [2024-06-10 11:50:58.373146] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1759780 00:40:14.612 [2024-06-10 11:50:58.373155] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:14.612 [2024-06-10 11:50:58.374498] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:14.612 [2024-06-10 11:50:58.374523] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:40:14.612 
BaseBdev1 00:40:14.612 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:40:14.612 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:40:14.612 BaseBdev2_malloc 00:40:14.612 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:40:14.871 [2024-06-10 11:50:58.702411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:40:14.871 [2024-06-10 11:50:58.702446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:14.871 [2024-06-10 11:50:58.702460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1904a50 00:40:14.871 [2024-06-10 11:50:58.702469] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:14.871 [2024-06-10 11:50:58.703595] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:14.871 [2024-06-10 11:50:58.703630] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:40:14.871 BaseBdev2 00:40:14.871 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:40:15.129 spare_malloc 00:40:15.129 11:50:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:40:15.129 spare_delay 00:40:15.129 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:40:15.388 [2024-06-10 11:50:59.223452] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:40:15.388 [2024-06-10 11:50:59.223491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:15.388 [2024-06-10 11:50:59.223504] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1907970 00:40:15.388 [2024-06-10 11:50:59.223513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:15.388 [2024-06-10 11:50:59.224607] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:15.388 [2024-06-10 11:50:59.224630] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:40:15.388 spare 00:40:15.388 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:40:15.647 [2024-06-10 11:50:59.395920] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:40:15.647 [2024-06-10 11:50:59.396925] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:40:15.647 [2024-06-10 11:50:59.397051] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1909410 00:40:15.647 [2024-06-10 11:50:59.397061] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:15.647 [2024-06-10 11:50:59.397201] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1908d20 00:40:15.647 [2024-06-10 11:50:59.397310] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1909410 00:40:15.647 [2024-06-10 11:50:59.397317] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1909410 00:40:15.647 [2024-06-10 11:50:59.397385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:15.647 "name": "raid_bdev1", 00:40:15.647 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:15.647 "strip_size_kb": 0, 00:40:15.647 "state": "online", 00:40:15.647 "raid_level": "raid1", 00:40:15.647 "superblock": true, 00:40:15.647 "num_base_bdevs": 2, 00:40:15.647 
"num_base_bdevs_discovered": 2, 00:40:15.647 "num_base_bdevs_operational": 2, 00:40:15.647 "base_bdevs_list": [ 00:40:15.647 { 00:40:15.647 "name": "BaseBdev1", 00:40:15.647 "uuid": "a43dbddc-c9be-51c9-88a4-dc7c4ffe0a2d", 00:40:15.647 "is_configured": true, 00:40:15.647 "data_offset": 256, 00:40:15.647 "data_size": 7936 00:40:15.647 }, 00:40:15.647 { 00:40:15.647 "name": "BaseBdev2", 00:40:15.647 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:15.647 "is_configured": true, 00:40:15.647 "data_offset": 256, 00:40:15.647 "data_size": 7936 00:40:15.647 } 00:40:15.647 ] 00:40:15.647 }' 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:15.647 11:50:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:16.213 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:16.213 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:40:16.471 [2024-06-10 11:51:00.242239] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:16.471 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:40:16.471 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:16.472 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:40:16.731 
11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:40:16.731 [2024-06-10 11:51:00.595048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1908d20 00:40:16.731 /dev/nbd0 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:40:16.731 11:51:00 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:16.731 1+0 records in 00:40:16.731 1+0 records out 00:40:16.731 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237916 s, 17.2 MB/s 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:40:16.731 11:51:00 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:40:17.299 7936+0 records in 00:40:17.299 7936+0 records out 00:40:17.299 32505856 bytes (33 MB, 31 MiB) copied, 0.506451 s, 64.2 MB/s 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:17.299 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:40:17.557 [2024-06-10 11:51:01.381021] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:40:17.557 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:40:17.816 [2024-06-10 11:51:01.545482] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:17.816 "name": "raid_bdev1", 00:40:17.816 "uuid": 
"16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:17.816 "strip_size_kb": 0, 00:40:17.816 "state": "online", 00:40:17.816 "raid_level": "raid1", 00:40:17.816 "superblock": true, 00:40:17.816 "num_base_bdevs": 2, 00:40:17.816 "num_base_bdevs_discovered": 1, 00:40:17.816 "num_base_bdevs_operational": 1, 00:40:17.816 "base_bdevs_list": [ 00:40:17.816 { 00:40:17.816 "name": null, 00:40:17.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:17.816 "is_configured": false, 00:40:17.816 "data_offset": 256, 00:40:17.816 "data_size": 7936 00:40:17.816 }, 00:40:17.816 { 00:40:17.816 "name": "BaseBdev2", 00:40:17.816 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:17.816 "is_configured": true, 00:40:17.816 "data_offset": 256, 00:40:17.816 "data_size": 7936 00:40:17.816 } 00:40:17.816 ] 00:40:17.816 }' 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:17.816 11:51:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:18.382 11:51:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:40:18.640 [2024-06-10 11:51:02.407724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:18.640 [2024-06-10 11:51:02.412159] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1758f60 00:40:18.640 [2024-06-10 11:51:02.413773] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:18.640 11:51:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:40:19.575 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:19.575 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:19.575 11:51:03 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:19.575 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:19.575 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:19.575 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:19.575 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:19.835 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:19.835 "name": "raid_bdev1", 00:40:19.835 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:19.835 "strip_size_kb": 0, 00:40:19.835 "state": "online", 00:40:19.835 "raid_level": "raid1", 00:40:19.835 "superblock": true, 00:40:19.835 "num_base_bdevs": 2, 00:40:19.835 "num_base_bdevs_discovered": 2, 00:40:19.835 "num_base_bdevs_operational": 2, 00:40:19.835 "process": { 00:40:19.835 "type": "rebuild", 00:40:19.835 "target": "spare", 00:40:19.835 "progress": { 00:40:19.835 "blocks": 2816, 00:40:19.835 "percent": 35 00:40:19.835 } 00:40:19.835 }, 00:40:19.835 "base_bdevs_list": [ 00:40:19.835 { 00:40:19.835 "name": "spare", 00:40:19.835 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:19.835 "is_configured": true, 00:40:19.835 "data_offset": 256, 00:40:19.835 "data_size": 7936 00:40:19.835 }, 00:40:19.835 { 00:40:19.835 "name": "BaseBdev2", 00:40:19.835 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:19.835 "is_configured": true, 00:40:19.835 "data_offset": 256, 00:40:19.835 "data_size": 7936 00:40:19.835 } 00:40:19.835 ] 00:40:19.835 }' 00:40:19.835 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:19.835 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:19.835 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:19.835 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:19.835 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:40:20.094 [2024-06-10 11:51:03.828607] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:20.094 [2024-06-10 11:51:03.924860] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:40:20.094 [2024-06-10 11:51:03.924901] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:20.094 [2024-06-10 11:51:03.924912] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:20.094 [2024-06-10 11:51:03.924917] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:20.094 11:51:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:20.353 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:20.353 "name": "raid_bdev1", 00:40:20.353 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:20.353 "strip_size_kb": 0, 00:40:20.353 "state": "online", 00:40:20.353 "raid_level": "raid1", 00:40:20.353 "superblock": true, 00:40:20.353 "num_base_bdevs": 2, 00:40:20.353 "num_base_bdevs_discovered": 1, 00:40:20.353 "num_base_bdevs_operational": 1, 00:40:20.353 "base_bdevs_list": [ 00:40:20.353 { 00:40:20.353 "name": null, 00:40:20.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:20.353 "is_configured": false, 00:40:20.353 "data_offset": 256, 00:40:20.353 "data_size": 7936 00:40:20.353 }, 00:40:20.353 { 00:40:20.353 "name": "BaseBdev2", 00:40:20.353 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:20.353 "is_configured": true, 00:40:20.353 "data_offset": 256, 00:40:20.353 "data_size": 7936 00:40:20.353 } 00:40:20.353 ] 00:40:20.353 }' 00:40:20.353 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:20.353 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:20.922 "name": "raid_bdev1", 00:40:20.922 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:20.922 "strip_size_kb": 0, 00:40:20.922 "state": "online", 00:40:20.922 "raid_level": "raid1", 00:40:20.922 "superblock": true, 00:40:20.922 "num_base_bdevs": 2, 00:40:20.922 "num_base_bdevs_discovered": 1, 00:40:20.922 "num_base_bdevs_operational": 1, 00:40:20.922 "base_bdevs_list": [ 00:40:20.922 { 00:40:20.922 "name": null, 00:40:20.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:20.922 "is_configured": false, 00:40:20.922 "data_offset": 256, 00:40:20.922 "data_size": 7936 00:40:20.922 }, 00:40:20.922 { 00:40:20.922 "name": "BaseBdev2", 00:40:20.922 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:20.922 "is_configured": true, 00:40:20.922 "data_offset": 256, 00:40:20.922 "data_size": 7936 00:40:20.922 } 00:40:20.922 ] 00:40:20.922 }' 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:20.922 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:40:21.182 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:21.182 11:51:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:40:21.182 [2024-06-10 11:51:05.035854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:21.182 [2024-06-10 11:51:05.040342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1758f60 00:40:21.182 [2024-06-10 11:51:05.041427] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:21.182 11:51:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:22.130 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:22.390 "name": "raid_bdev1", 00:40:22.390 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:22.390 "strip_size_kb": 0, 00:40:22.390 "state": "online", 00:40:22.390 
"raid_level": "raid1", 00:40:22.390 "superblock": true, 00:40:22.390 "num_base_bdevs": 2, 00:40:22.390 "num_base_bdevs_discovered": 2, 00:40:22.390 "num_base_bdevs_operational": 2, 00:40:22.390 "process": { 00:40:22.390 "type": "rebuild", 00:40:22.390 "target": "spare", 00:40:22.390 "progress": { 00:40:22.390 "blocks": 2816, 00:40:22.390 "percent": 35 00:40:22.390 } 00:40:22.390 }, 00:40:22.390 "base_bdevs_list": [ 00:40:22.390 { 00:40:22.390 "name": "spare", 00:40:22.390 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:22.390 "is_configured": true, 00:40:22.390 "data_offset": 256, 00:40:22.390 "data_size": 7936 00:40:22.390 }, 00:40:22.390 { 00:40:22.390 "name": "BaseBdev2", 00:40:22.390 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:22.390 "is_configured": true, 00:40:22.390 "data_offset": 256, 00:40:22.390 "data_size": 7936 00:40:22.390 } 00:40:22.390 ] 00:40:22.390 }' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:40:22.390 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=796 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:22.390 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:22.651 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:22.651 "name": "raid_bdev1", 00:40:22.651 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:22.651 "strip_size_kb": 0, 00:40:22.651 "state": "online", 00:40:22.651 "raid_level": "raid1", 00:40:22.651 "superblock": true, 00:40:22.651 "num_base_bdevs": 2, 00:40:22.651 "num_base_bdevs_discovered": 2, 00:40:22.651 "num_base_bdevs_operational": 2, 00:40:22.651 "process": { 00:40:22.651 "type": "rebuild", 00:40:22.651 "target": "spare", 00:40:22.651 "progress": { 00:40:22.651 "blocks": 3584, 00:40:22.651 "percent": 45 00:40:22.651 } 00:40:22.651 }, 00:40:22.651 "base_bdevs_list": [ 00:40:22.651 { 00:40:22.651 "name": "spare", 00:40:22.651 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:22.651 "is_configured": 
true, 00:40:22.651 "data_offset": 256, 00:40:22.651 "data_size": 7936 00:40:22.651 }, 00:40:22.651 { 00:40:22.651 "name": "BaseBdev2", 00:40:22.651 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:22.651 "is_configured": true, 00:40:22.651 "data_offset": 256, 00:40:22.651 "data_size": 7936 00:40:22.651 } 00:40:22.651 ] 00:40:22.651 }' 00:40:22.651 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:22.651 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:22.651 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:22.651 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:22.651 11:51:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:24.041 "name": "raid_bdev1", 00:40:24.041 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:24.041 "strip_size_kb": 0, 00:40:24.041 "state": "online", 00:40:24.041 "raid_level": "raid1", 00:40:24.041 "superblock": true, 00:40:24.041 "num_base_bdevs": 2, 00:40:24.041 "num_base_bdevs_discovered": 2, 00:40:24.041 "num_base_bdevs_operational": 2, 00:40:24.041 "process": { 00:40:24.041 "type": "rebuild", 00:40:24.041 "target": "spare", 00:40:24.041 "progress": { 00:40:24.041 "blocks": 6656, 00:40:24.041 "percent": 83 00:40:24.041 } 00:40:24.041 }, 00:40:24.041 "base_bdevs_list": [ 00:40:24.041 { 00:40:24.041 "name": "spare", 00:40:24.041 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:24.041 "is_configured": true, 00:40:24.041 "data_offset": 256, 00:40:24.041 "data_size": 7936 00:40:24.041 }, 00:40:24.041 { 00:40:24.041 "name": "BaseBdev2", 00:40:24.041 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:24.041 "is_configured": true, 00:40:24.041 "data_offset": 256, 00:40:24.041 "data_size": 7936 00:40:24.041 } 00:40:24.041 ] 00:40:24.041 }' 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:24.041 11:51:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:40:24.300 [2024-06-10 11:51:08.163780] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:40:24.300 [2024-06-10 11:51:08.163824] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:40:24.300 [2024-06-10 11:51:08.163895] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:25.236 "name": "raid_bdev1", 00:40:25.236 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:25.236 "strip_size_kb": 0, 00:40:25.236 "state": "online", 00:40:25.236 "raid_level": "raid1", 00:40:25.236 "superblock": true, 00:40:25.236 "num_base_bdevs": 2, 00:40:25.236 "num_base_bdevs_discovered": 2, 00:40:25.236 "num_base_bdevs_operational": 2, 00:40:25.236 "base_bdevs_list": [ 00:40:25.236 { 00:40:25.236 "name": "spare", 00:40:25.236 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:25.236 "is_configured": true, 00:40:25.236 "data_offset": 256, 00:40:25.236 "data_size": 7936 00:40:25.236 }, 00:40:25.236 { 00:40:25.236 "name": "BaseBdev2", 00:40:25.236 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:25.236 "is_configured": true, 00:40:25.236 "data_offset": 256, 00:40:25.236 
"data_size": 7936 00:40:25.236 } 00:40:25.236 ] 00:40:25.236 }' 00:40:25.236 11:51:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:25.236 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:25.495 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:25.495 "name": "raid_bdev1", 00:40:25.495 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:25.495 "strip_size_kb": 0, 00:40:25.495 "state": "online", 00:40:25.495 "raid_level": "raid1", 00:40:25.495 "superblock": true, 00:40:25.495 "num_base_bdevs": 2, 00:40:25.495 "num_base_bdevs_discovered": 2, 00:40:25.496 "num_base_bdevs_operational": 2, 00:40:25.496 
"base_bdevs_list": [ 00:40:25.496 { 00:40:25.496 "name": "spare", 00:40:25.496 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:25.496 "is_configured": true, 00:40:25.496 "data_offset": 256, 00:40:25.496 "data_size": 7936 00:40:25.496 }, 00:40:25.496 { 00:40:25.496 "name": "BaseBdev2", 00:40:25.496 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:25.496 "is_configured": true, 00:40:25.496 "data_offset": 256, 00:40:25.496 "data_size": 7936 00:40:25.496 } 00:40:25.496 ] 00:40:25.496 }' 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:25.496 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:25.755 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:25.755 "name": "raid_bdev1", 00:40:25.755 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:25.755 "strip_size_kb": 0, 00:40:25.755 "state": "online", 00:40:25.755 "raid_level": "raid1", 00:40:25.755 "superblock": true, 00:40:25.755 "num_base_bdevs": 2, 00:40:25.755 "num_base_bdevs_discovered": 2, 00:40:25.755 "num_base_bdevs_operational": 2, 00:40:25.755 "base_bdevs_list": [ 00:40:25.755 { 00:40:25.755 "name": "spare", 00:40:25.755 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:25.755 "is_configured": true, 00:40:25.755 "data_offset": 256, 00:40:25.755 "data_size": 7936 00:40:25.755 }, 00:40:25.755 { 00:40:25.755 "name": "BaseBdev2", 00:40:25.755 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:25.755 "is_configured": true, 00:40:25.755 "data_offset": 256, 00:40:25.755 "data_size": 7936 00:40:25.755 } 00:40:25.755 ] 00:40:25.755 }' 00:40:25.755 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:25.755 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:26.322 11:51:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:26.322 [2024-06-10 11:51:10.141703] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:26.322 [2024-06-10 11:51:10.141728] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:40:26.322 [2024-06-10 11:51:10.141767] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:26.322 [2024-06-10 11:51:10.141806] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:26.322 [2024-06-10 11:51:10.141815] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1909410 name raid_bdev1, state offline 00:40:26.322 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:26.322 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:40:26.581 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:40:26.581 /dev/nbd0 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:26.840 1+0 records in 00:40:26.840 1+0 records out 00:40:26.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000124248 s, 33.0 MB/s 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@885 -- # size=4096 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:40:26.840 /dev/nbd1 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:26.840 1+0 records in 00:40:26.840 1+0 records out 00:40:26.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283546 s, 14.4 MB/s 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:40:26.840 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:40:27.099 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:40:27.099 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:40:27.099 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:27.099 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:40:27.099 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:40:27.099 11:51:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:27.099 11:51:10 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:40:27.099 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:40:27.099 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:40:27.099 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:40:27.099 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:27.099 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:27.100 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:40:27.100 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:40:27.100 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:40:27.100 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:27.100 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:40:27.359 11:51:11 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:40:27.359 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:40:27.617 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:40:27.617 [2024-06-10 11:51:11.544850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:40:27.617 [2024-06-10 11:51:11.544893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:27.617 [2024-06-10 11:51:11.544908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1909690 00:40:27.617 [2024-06-10 11:51:11.544916] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:27.617 [2024-06-10 11:51:11.546134] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:27.617 [2024-06-10 11:51:11.546159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:40:27.617 [2024-06-10 11:51:11.546218] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:40:27.617 [2024-06-10 11:51:11.546237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:27.617 [2024-06-10 11:51:11.546310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:40:27.617 spare 00:40:27.877 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:27.877 11:51:11 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:27.877 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:27.878 [2024-06-10 11:51:11.646606] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1906670 00:40:27.878 [2024-06-10 11:51:11.646619] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:27.878 [2024-06-10 11:51:11.646757] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1906950 00:40:27.878 [2024-06-10 11:51:11.646861] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1906670 00:40:27.878 [2024-06-10 11:51:11.646873] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1906670 00:40:27.878 [2024-06-10 11:51:11.646942] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:27.878 "name": "raid_bdev1", 00:40:27.878 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:27.878 "strip_size_kb": 0, 00:40:27.878 "state": "online", 00:40:27.878 "raid_level": "raid1", 00:40:27.878 "superblock": true, 00:40:27.878 "num_base_bdevs": 2, 00:40:27.878 "num_base_bdevs_discovered": 2, 00:40:27.878 "num_base_bdevs_operational": 2, 00:40:27.878 "base_bdevs_list": [ 00:40:27.878 { 00:40:27.878 "name": "spare", 00:40:27.878 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:27.878 "is_configured": true, 00:40:27.878 "data_offset": 256, 00:40:27.878 "data_size": 7936 00:40:27.878 }, 00:40:27.878 { 00:40:27.878 "name": "BaseBdev2", 00:40:27.878 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:27.878 "is_configured": true, 00:40:27.878 "data_offset": 256, 00:40:27.878 "data_size": 7936 00:40:27.878 } 00:40:27.878 ] 00:40:27.878 }' 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:27.878 11:51:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:40:28.445 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:28.703 "name": "raid_bdev1", 00:40:28.703 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:28.703 "strip_size_kb": 0, 00:40:28.703 "state": "online", 00:40:28.703 "raid_level": "raid1", 00:40:28.703 "superblock": true, 00:40:28.703 "num_base_bdevs": 2, 00:40:28.703 "num_base_bdevs_discovered": 2, 00:40:28.703 "num_base_bdevs_operational": 2, 00:40:28.703 "base_bdevs_list": [ 00:40:28.703 { 00:40:28.703 "name": "spare", 00:40:28.703 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:28.703 "is_configured": true, 00:40:28.703 "data_offset": 256, 00:40:28.703 "data_size": 7936 00:40:28.703 }, 00:40:28.703 { 00:40:28.703 "name": "BaseBdev2", 00:40:28.703 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:28.703 "is_configured": true, 00:40:28.703 "data_offset": 256, 00:40:28.703 "data_size": 7936 00:40:28.703 } 00:40:28.703 ] 00:40:28.703 }' 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:28.703 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare 
== \s\p\a\r\e ]] 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:40:28.962 [2024-06-10 11:51:12.808178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:28.962 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:29.222 11:51:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:29.222 "name": "raid_bdev1", 00:40:29.222 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 
00:40:29.222 "strip_size_kb": 0, 00:40:29.222 "state": "online", 00:40:29.222 "raid_level": "raid1", 00:40:29.222 "superblock": true, 00:40:29.222 "num_base_bdevs": 2, 00:40:29.222 "num_base_bdevs_discovered": 1, 00:40:29.222 "num_base_bdevs_operational": 1, 00:40:29.222 "base_bdevs_list": [ 00:40:29.222 { 00:40:29.222 "name": null, 00:40:29.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:29.222 "is_configured": false, 00:40:29.222 "data_offset": 256, 00:40:29.222 "data_size": 7936 00:40:29.222 }, 00:40:29.222 { 00:40:29.222 "name": "BaseBdev2", 00:40:29.222 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:29.222 "is_configured": true, 00:40:29.222 "data_offset": 256, 00:40:29.222 "data_size": 7936 00:40:29.222 } 00:40:29.222 ] 00:40:29.222 }' 00:40:29.222 11:51:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:29.222 11:51:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:29.790 11:51:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:40:29.790 [2024-06-10 11:51:13.658401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:29.790 [2024-06-10 11:51:13.658516] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:40:29.790 [2024-06-10 11:51:13.658528] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:40:29.790 [2024-06-10 11:51:13.658548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:29.790 [2024-06-10 11:51:13.663412] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1907da0 00:40:29.790 [2024-06-10 11:51:13.665234] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:29.790 11:51:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:31.167 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:31.167 "name": "raid_bdev1", 00:40:31.167 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:31.167 "strip_size_kb": 0, 00:40:31.167 "state": "online", 00:40:31.167 "raid_level": "raid1", 00:40:31.167 "superblock": true, 00:40:31.167 "num_base_bdevs": 2, 00:40:31.167 "num_base_bdevs_discovered": 2, 00:40:31.167 "num_base_bdevs_operational": 2, 00:40:31.167 "process": { 00:40:31.167 "type": "rebuild", 00:40:31.167 "target": "spare", 00:40:31.167 "progress": { 00:40:31.167 "blocks": 2816, 
00:40:31.167 "percent": 35 00:40:31.167 } 00:40:31.167 }, 00:40:31.167 "base_bdevs_list": [ 00:40:31.167 { 00:40:31.167 "name": "spare", 00:40:31.167 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:31.167 "is_configured": true, 00:40:31.167 "data_offset": 256, 00:40:31.167 "data_size": 7936 00:40:31.167 }, 00:40:31.167 { 00:40:31.167 "name": "BaseBdev2", 00:40:31.167 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:31.167 "is_configured": true, 00:40:31.167 "data_offset": 256, 00:40:31.167 "data_size": 7936 00:40:31.167 } 00:40:31.167 ] 00:40:31.167 }' 00:40:31.168 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:31.168 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:31.168 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:31.168 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:31.168 11:51:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:40:31.168 [2024-06-10 11:51:15.099546] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:31.427 [2024-06-10 11:51:15.176147] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:40:31.427 [2024-06-10 11:51:15.176182] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:31.427 [2024-06-10 11:51:15.176192] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:31.427 [2024-06-10 11:51:15.176198] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:31.427 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:31.686 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:31.686 "name": "raid_bdev1", 00:40:31.686 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:31.686 "strip_size_kb": 0, 00:40:31.686 "state": "online", 00:40:31.686 "raid_level": "raid1", 00:40:31.686 "superblock": true, 00:40:31.686 "num_base_bdevs": 2, 00:40:31.686 "num_base_bdevs_discovered": 1, 00:40:31.686 "num_base_bdevs_operational": 1, 00:40:31.686 "base_bdevs_list": [ 00:40:31.686 { 00:40:31.686 "name": null, 00:40:31.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:31.686 "is_configured": false, 00:40:31.686 "data_offset": 
256, 00:40:31.686 "data_size": 7936 00:40:31.686 }, 00:40:31.686 { 00:40:31.686 "name": "BaseBdev2", 00:40:31.686 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:31.686 "is_configured": true, 00:40:31.686 "data_offset": 256, 00:40:31.686 "data_size": 7936 00:40:31.686 } 00:40:31.686 ] 00:40:31.686 }' 00:40:31.686 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:31.686 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:31.945 11:51:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:40:32.204 [2024-06-10 11:51:16.031152] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:40:32.204 [2024-06-10 11:51:16.031189] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:32.204 [2024-06-10 11:51:16.031204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1907ba0 00:40:32.204 [2024-06-10 11:51:16.031212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:32.204 [2024-06-10 11:51:16.031471] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:32.204 [2024-06-10 11:51:16.031485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:40:32.204 [2024-06-10 11:51:16.031543] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:40:32.204 [2024-06-10 11:51:16.031552] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:40:32.204 [2024-06-10 11:51:16.031559] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:40:32.204 [2024-06-10 11:51:16.031572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:32.204 [2024-06-10 11:51:16.035958] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x190a230 00:40:32.204 spare 00:40:32.204 [2024-06-10 11:51:16.037018] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:32.204 11:51:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:33.201 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:33.461 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:33.461 "name": "raid_bdev1", 00:40:33.461 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:33.461 "strip_size_kb": 0, 00:40:33.461 "state": "online", 00:40:33.461 "raid_level": "raid1", 00:40:33.461 "superblock": true, 00:40:33.461 "num_base_bdevs": 2, 00:40:33.461 "num_base_bdevs_discovered": 2, 00:40:33.461 "num_base_bdevs_operational": 2, 00:40:33.461 "process": { 00:40:33.461 "type": "rebuild", 00:40:33.461 "target": "spare", 00:40:33.461 "progress": { 00:40:33.461 
"blocks": 2816, 00:40:33.461 "percent": 35 00:40:33.461 } 00:40:33.461 }, 00:40:33.461 "base_bdevs_list": [ 00:40:33.461 { 00:40:33.461 "name": "spare", 00:40:33.461 "uuid": "ecfbce0a-bea8-5ce1-b34d-c2bc2ad6b7a6", 00:40:33.461 "is_configured": true, 00:40:33.461 "data_offset": 256, 00:40:33.461 "data_size": 7936 00:40:33.461 }, 00:40:33.461 { 00:40:33.461 "name": "BaseBdev2", 00:40:33.461 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:33.461 "is_configured": true, 00:40:33.461 "data_offset": 256, 00:40:33.461 "data_size": 7936 00:40:33.461 } 00:40:33.461 ] 00:40:33.461 }' 00:40:33.461 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:33.461 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:33.461 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:33.461 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:33.461 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:40:33.721 [2024-06-10 11:51:17.467473] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:33.721 [2024-06-10 11:51:17.548124] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:40:33.721 [2024-06-10 11:51:17.548155] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:33.721 [2024-06-10 11:51:17.548164] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:33.721 [2024-06-10 11:51:17.548170] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:33.721 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:33.980 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:33.980 "name": "raid_bdev1", 00:40:33.980 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:33.980 "strip_size_kb": 0, 00:40:33.980 "state": "online", 00:40:33.980 "raid_level": "raid1", 00:40:33.980 "superblock": true, 00:40:33.980 "num_base_bdevs": 2, 00:40:33.980 "num_base_bdevs_discovered": 1, 00:40:33.980 "num_base_bdevs_operational": 1, 00:40:33.980 "base_bdevs_list": [ 00:40:33.980 { 00:40:33.980 "name": null, 00:40:33.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:33.980 "is_configured": false, 00:40:33.980 
"data_offset": 256, 00:40:33.980 "data_size": 7936 00:40:33.980 }, 00:40:33.980 { 00:40:33.980 "name": "BaseBdev2", 00:40:33.980 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:33.980 "is_configured": true, 00:40:33.980 "data_offset": 256, 00:40:33.980 "data_size": 7936 00:40:33.980 } 00:40:33.980 ] 00:40:33.980 }' 00:40:33.980 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:33.980 11:51:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:34.565 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:34.565 "name": "raid_bdev1", 00:40:34.565 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:34.565 "strip_size_kb": 0, 00:40:34.565 "state": "online", 00:40:34.565 "raid_level": "raid1", 00:40:34.565 "superblock": true, 00:40:34.565 "num_base_bdevs": 2, 00:40:34.565 "num_base_bdevs_discovered": 1, 00:40:34.565 "num_base_bdevs_operational": 1, 00:40:34.565 "base_bdevs_list": [ 00:40:34.565 { 00:40:34.565 "name": null, 00:40:34.565 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:40:34.565 "is_configured": false, 00:40:34.565 "data_offset": 256, 00:40:34.565 "data_size": 7936 00:40:34.565 }, 00:40:34.565 { 00:40:34.565 "name": "BaseBdev2", 00:40:34.565 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:34.565 "is_configured": true, 00:40:34.565 "data_offset": 256, 00:40:34.566 "data_size": 7936 00:40:34.566 } 00:40:34.566 ] 00:40:34.566 }' 00:40:34.566 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:34.566 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:34.566 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:34.566 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:34.566 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:40:34.824 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:40:35.083 [2024-06-10 11:51:18.780003] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:40:35.083 [2024-06-10 11:51:18.780038] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:35.083 [2024-06-10 11:51:18.780052] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1909c00 00:40:35.083 [2024-06-10 11:51:18.780060] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:35.083 [2024-06-10 11:51:18.780313] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:35.083 [2024-06-10 11:51:18.780327] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:40:35.083 [2024-06-10 11:51:18.780375] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:40:35.083 [2024-06-10 11:51:18.780386] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:40:35.083 [2024-06-10 11:51:18.780394] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:40:35.083 BaseBdev1 00:40:35.083 11:51:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:36.020 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:36.020 11:51:19 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:36.278 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:36.278 "name": "raid_bdev1", 00:40:36.278 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:36.278 "strip_size_kb": 0, 00:40:36.278 "state": "online", 00:40:36.278 "raid_level": "raid1", 00:40:36.278 "superblock": true, 00:40:36.278 "num_base_bdevs": 2, 00:40:36.278 "num_base_bdevs_discovered": 1, 00:40:36.278 "num_base_bdevs_operational": 1, 00:40:36.278 "base_bdevs_list": [ 00:40:36.278 { 00:40:36.278 "name": null, 00:40:36.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:36.278 "is_configured": false, 00:40:36.278 "data_offset": 256, 00:40:36.278 "data_size": 7936 00:40:36.278 }, 00:40:36.278 { 00:40:36.278 "name": "BaseBdev2", 00:40:36.278 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:36.278 "is_configured": true, 00:40:36.278 "data_offset": 256, 00:40:36.278 "data_size": 7936 00:40:36.278 } 00:40:36.278 ] 00:40:36.278 }' 00:40:36.278 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:36.278 11:51:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:36.536 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:36.537 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:36.537 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:36.537 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:36.537 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:36.537 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:36.537 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:36.796 "name": "raid_bdev1", 00:40:36.796 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:36.796 "strip_size_kb": 0, 00:40:36.796 "state": "online", 00:40:36.796 "raid_level": "raid1", 00:40:36.796 "superblock": true, 00:40:36.796 "num_base_bdevs": 2, 00:40:36.796 "num_base_bdevs_discovered": 1, 00:40:36.796 "num_base_bdevs_operational": 1, 00:40:36.796 "base_bdevs_list": [ 00:40:36.796 { 00:40:36.796 "name": null, 00:40:36.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:36.796 "is_configured": false, 00:40:36.796 "data_offset": 256, 00:40:36.796 "data_size": 7936 00:40:36.796 }, 00:40:36.796 { 00:40:36.796 "name": "BaseBdev2", 00:40:36.796 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:36.796 "is_configured": true, 00:40:36.796 "data_offset": 256, 00:40:36.796 "data_size": 7936 00:40:36.796 } 00:40:36.796 ] 00:40:36.796 }' 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@649 -- # local es=0 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:36.796 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:36.797 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:36.797 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:36.797 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:40:36.797 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:40:37.056 [2024-06-10 11:51:20.881473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:40:37.056 [2024-06-10 11:51:20.881573] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:40:37.056 [2024-06-10 11:51:20.881584] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:40:37.056 request: 00:40:37.056 { 00:40:37.056 "raid_bdev": "raid_bdev1", 00:40:37.056 "base_bdev": "BaseBdev1", 00:40:37.056 "method": "bdev_raid_add_base_bdev", 00:40:37.056 "req_id": 1 00:40:37.056 } 00:40:37.056 Got JSON-RPC error response 00:40:37.056 response: 00:40:37.056 { 00:40:37.056 "code": -22, 00:40:37.056 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:40:37.056 } 00:40:37.056 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # es=1 00:40:37.056 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:40:37.056 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:40:37.056 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:40:37.056 11:51:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:37.992 11:51:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:38.252 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:38.252 "name": "raid_bdev1", 00:40:38.252 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:38.252 "strip_size_kb": 0, 00:40:38.252 "state": "online", 00:40:38.252 "raid_level": "raid1", 00:40:38.252 "superblock": true, 00:40:38.252 "num_base_bdevs": 2, 00:40:38.252 "num_base_bdevs_discovered": 1, 00:40:38.252 "num_base_bdevs_operational": 1, 00:40:38.252 "base_bdevs_list": [ 00:40:38.252 { 00:40:38.252 "name": null, 00:40:38.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:38.252 "is_configured": false, 00:40:38.252 "data_offset": 256, 00:40:38.252 "data_size": 7936 00:40:38.252 }, 00:40:38.252 { 00:40:38.252 "name": "BaseBdev2", 00:40:38.252 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:38.252 "is_configured": true, 00:40:38.252 "data_offset": 256, 00:40:38.252 "data_size": 7936 00:40:38.252 } 00:40:38.252 ] 00:40:38.252 }' 00:40:38.252 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:38.252 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:38.820 
11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:38.820 "name": "raid_bdev1", 00:40:38.820 "uuid": "16219ab5-7e27-4471-9edf-d9adcc26eb3e", 00:40:38.820 "strip_size_kb": 0, 00:40:38.820 "state": "online", 00:40:38.820 "raid_level": "raid1", 00:40:38.820 "superblock": true, 00:40:38.820 "num_base_bdevs": 2, 00:40:38.820 "num_base_bdevs_discovered": 1, 00:40:38.820 "num_base_bdevs_operational": 1, 00:40:38.820 "base_bdevs_list": [ 00:40:38.820 { 00:40:38.820 "name": null, 00:40:38.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:38.820 "is_configured": false, 00:40:38.820 "data_offset": 256, 00:40:38.820 "data_size": 7936 00:40:38.820 }, 00:40:38.820 { 00:40:38.820 "name": "BaseBdev2", 00:40:38.820 "uuid": "d1cdc732-d7e7-5bbb-93f2-db930d1b4dfa", 00:40:38.820 "is_configured": true, 00:40:38.820 "data_offset": 256, 00:40:38.820 "data_size": 7936 00:40:38.820 } 00:40:38.820 ] 00:40:38.820 }' 00:40:38.820 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:39.078 11:51:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 246594 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 246594 ']' 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 246594 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 246594 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 246594' 00:40:39.078 killing process with pid 246594 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # kill 246594 00:40:39.078 Received shutdown signal, test time was about 60.000000 seconds 00:40:39.078 00:40:39.078 Latency(us) 00:40:39.078 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:39.078 =================================================================================================================== 00:40:39.078 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:40:39.078 [2024-06-10 11:51:22.860796] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:39.078 [2024-06-10 11:51:22.860859] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:39.078 [2024-06-10 11:51:22.860893] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:39.078 [2024-06-10 11:51:22.860907] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1906670 name raid_bdev1, state offline 00:40:39.078 11:51:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@973 -- # wait 246594 00:40:39.078 [2024-06-10 11:51:22.891222] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:39.336 11:51:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:40:39.336 00:40:39.336 real 0m25.908s 00:40:39.336 user 0m39.011s 00:40:39.336 sys 0m4.139s 00:40:39.336 11:51:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:39.336 11:51:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:40:39.336 ************************************ 00:40:39.336 END TEST raid_rebuild_test_sb_4k 00:40:39.336 ************************************ 00:40:39.336 11:51:23 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:40:39.336 11:51:23 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:40:39.336 11:51:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:40:39.336 11:51:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:39.336 11:51:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:39.336 ************************************ 00:40:39.336 START TEST raid_state_function_test_sb_md_separate 00:40:39.336 ************************************ 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@222 -- # local superblock=true 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # 
'[' raid1 '!=' raid1 ']' 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=250936 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 250936' 00:40:39.336 Process raid pid: 250936 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 250936 /var/tmp/spdk-raid.sock 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 250936 ']' 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:39.336 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:39.336 11:51:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:39.336 [2024-06-10 11:51:23.216873] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:40:39.336 [2024-06-10 11:51:23.216925] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:40:39.596 [2024-06-10 11:51:23.303076] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:39.596 [2024-06-10 11:51:23.387970] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:39.596 [2024-06-10 11:51:23.441630] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:39.596 [2024-06-10 11:51:23.441653] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:40.163 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:40:40.163 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:40:40.163 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:40:40.423 [2024-06-10 11:51:24.189979] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:40:40.423 [2024-06-10 11:51:24.190013] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:40:40.423 [2024-06-10 11:51:24.190020] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:40:40.423 [2024-06-10 11:51:24.190028] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:40.423 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:40.681 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:40.681 "name": "Existed_Raid", 00:40:40.681 "uuid": "88a89b6c-a8a8-4dd9-96de-61bad0b9f629", 00:40:40.681 
"strip_size_kb": 0, 00:40:40.681 "state": "configuring", 00:40:40.681 "raid_level": "raid1", 00:40:40.681 "superblock": true, 00:40:40.681 "num_base_bdevs": 2, 00:40:40.681 "num_base_bdevs_discovered": 0, 00:40:40.681 "num_base_bdevs_operational": 2, 00:40:40.681 "base_bdevs_list": [ 00:40:40.681 { 00:40:40.681 "name": "BaseBdev1", 00:40:40.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:40.681 "is_configured": false, 00:40:40.681 "data_offset": 0, 00:40:40.681 "data_size": 0 00:40:40.681 }, 00:40:40.681 { 00:40:40.681 "name": "BaseBdev2", 00:40:40.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:40.681 "is_configured": false, 00:40:40.681 "data_offset": 0, 00:40:40.681 "data_size": 0 00:40:40.681 } 00:40:40.681 ] 00:40:40.681 }' 00:40:40.681 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:40.681 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:40.941 11:51:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:40:41.200 [2024-06-10 11:51:24.995982] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:40:41.200 [2024-06-10 11:51:24.996008] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc4510 name Existed_Raid, state configuring 00:40:41.200 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:40:41.459 [2024-06-10 11:51:25.168442] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:40:41.459 [2024-06-10 11:51:25.168465] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:40:41.459 [2024-06-10 11:51:25.168471] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:40:41.459 [2024-06-10 11:51:25.168478] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:40:41.459 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:40:41.460 [2024-06-10 11:51:25.349981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:40:41.460 BaseBdev1 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:40:41.460 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:40:41.719 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:40:41.978 [ 00:40:41.978 { 00:40:41.978 "name": "BaseBdev1", 00:40:41.978 "aliases": [ 00:40:41.978 
"0a1ab01a-8911-4617-b0e8-0771412c3a24" 00:40:41.978 ], 00:40:41.978 "product_name": "Malloc disk", 00:40:41.978 "block_size": 4096, 00:40:41.978 "num_blocks": 8192, 00:40:41.978 "uuid": "0a1ab01a-8911-4617-b0e8-0771412c3a24", 00:40:41.978 "md_size": 32, 00:40:41.978 "md_interleave": false, 00:40:41.978 "dif_type": 0, 00:40:41.978 "assigned_rate_limits": { 00:40:41.978 "rw_ios_per_sec": 0, 00:40:41.978 "rw_mbytes_per_sec": 0, 00:40:41.978 "r_mbytes_per_sec": 0, 00:40:41.978 "w_mbytes_per_sec": 0 00:40:41.978 }, 00:40:41.978 "claimed": true, 00:40:41.978 "claim_type": "exclusive_write", 00:40:41.978 "zoned": false, 00:40:41.978 "supported_io_types": { 00:40:41.978 "read": true, 00:40:41.979 "write": true, 00:40:41.979 "unmap": true, 00:40:41.979 "write_zeroes": true, 00:40:41.979 "flush": true, 00:40:41.979 "reset": true, 00:40:41.979 "compare": false, 00:40:41.979 "compare_and_write": false, 00:40:41.979 "abort": true, 00:40:41.979 "nvme_admin": false, 00:40:41.979 "nvme_io": false 00:40:41.979 }, 00:40:41.979 "memory_domains": [ 00:40:41.979 { 00:40:41.979 "dma_device_id": "system", 00:40:41.979 "dma_device_type": 1 00:40:41.979 }, 00:40:41.979 { 00:40:41.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:41.979 "dma_device_type": 2 00:40:41.979 } 00:40:41.979 ], 00:40:41.979 "driver_specific": {} 00:40:41.979 } 00:40:41.979 ] 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:41.979 "name": "Existed_Raid", 00:40:41.979 "uuid": "699fd971-ce71-4191-93d2-262750bd8122", 00:40:41.979 "strip_size_kb": 0, 00:40:41.979 "state": "configuring", 00:40:41.979 "raid_level": "raid1", 00:40:41.979 "superblock": true, 00:40:41.979 "num_base_bdevs": 2, 00:40:41.979 "num_base_bdevs_discovered": 1, 00:40:41.979 "num_base_bdevs_operational": 2, 00:40:41.979 "base_bdevs_list": [ 00:40:41.979 { 00:40:41.979 "name": "BaseBdev1", 00:40:41.979 "uuid": "0a1ab01a-8911-4617-b0e8-0771412c3a24", 00:40:41.979 "is_configured": true, 00:40:41.979 "data_offset": 256, 00:40:41.979 "data_size": 7936 00:40:41.979 }, 00:40:41.979 { 00:40:41.979 "name": "BaseBdev2", 00:40:41.979 "uuid": "00000000-0000-0000-0000-000000000000", 
00:40:41.979 "is_configured": false, 00:40:41.979 "data_offset": 0, 00:40:41.979 "data_size": 0 00:40:41.979 } 00:40:41.979 ] 00:40:41.979 }' 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:41.979 11:51:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:42.547 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:40:42.806 [2024-06-10 11:51:26.545087] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:40:42.806 [2024-06-10 11:51:26.545117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc3e00 name Existed_Raid, state configuring 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:40:42.806 [2024-06-10 11:51:26.717554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:40:42.806 [2024-06-10 11:51:26.718660] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:40:42.806 [2024-06-10 11:51:26.718686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:42.806 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:42.807 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:42.807 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:42.807 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:43.066 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:43.066 "name": "Existed_Raid", 00:40:43.066 "uuid": "f862fc37-0bac-48e0-b073-65005cbb71ba", 00:40:43.066 "strip_size_kb": 0, 00:40:43.066 "state": "configuring", 00:40:43.066 "raid_level": "raid1", 00:40:43.066 "superblock": true, 00:40:43.066 "num_base_bdevs": 2, 00:40:43.066 "num_base_bdevs_discovered": 1, 00:40:43.066 "num_base_bdevs_operational": 2, 00:40:43.066 "base_bdevs_list": [ 00:40:43.066 { 00:40:43.066 "name": "BaseBdev1", 
00:40:43.066 "uuid": "0a1ab01a-8911-4617-b0e8-0771412c3a24", 00:40:43.066 "is_configured": true, 00:40:43.066 "data_offset": 256, 00:40:43.066 "data_size": 7936 00:40:43.066 }, 00:40:43.066 { 00:40:43.066 "name": "BaseBdev2", 00:40:43.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:43.066 "is_configured": false, 00:40:43.066 "data_offset": 0, 00:40:43.066 "data_size": 0 00:40:43.066 } 00:40:43.066 ] 00:40:43.066 }' 00:40:43.066 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:43.066 11:51:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:43.634 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:40:43.892 [2024-06-10 11:51:27.591353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:40:43.893 [2024-06-10 11:51:27.591462] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcc3540 00:40:43.893 [2024-06-10 11:51:27.591471] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:43.893 [2024-06-10 11:51:27.591512] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc2f80 00:40:43.893 [2024-06-10 11:51:27.591578] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcc3540 00:40:43.893 [2024-06-10 11:51:27.591585] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcc3540 00:40:43.893 [2024-06-10 11:51:27.591632] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:43.893 BaseBdev2 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate 
-- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:40:43.893 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:40:44.152 [ 00:40:44.152 { 00:40:44.152 "name": "BaseBdev2", 00:40:44.152 "aliases": [ 00:40:44.152 "5d81b2bb-9b68-429a-89ef-e61814933b6f" 00:40:44.152 ], 00:40:44.152 "product_name": "Malloc disk", 00:40:44.152 "block_size": 4096, 00:40:44.152 "num_blocks": 8192, 00:40:44.152 "uuid": "5d81b2bb-9b68-429a-89ef-e61814933b6f", 00:40:44.152 "md_size": 32, 00:40:44.152 "md_interleave": false, 00:40:44.152 "dif_type": 0, 00:40:44.152 "assigned_rate_limits": { 00:40:44.152 "rw_ios_per_sec": 0, 00:40:44.152 "rw_mbytes_per_sec": 0, 00:40:44.152 "r_mbytes_per_sec": 0, 00:40:44.152 "w_mbytes_per_sec": 0 00:40:44.152 }, 00:40:44.152 "claimed": true, 00:40:44.152 "claim_type": "exclusive_write", 00:40:44.152 "zoned": false, 00:40:44.152 "supported_io_types": { 00:40:44.152 "read": true, 00:40:44.152 "write": true, 00:40:44.152 "unmap": true, 00:40:44.152 "write_zeroes": true, 00:40:44.152 "flush": true, 00:40:44.152 "reset": true, 00:40:44.152 "compare": false, 00:40:44.152 
"compare_and_write": false, 00:40:44.152 "abort": true, 00:40:44.152 "nvme_admin": false, 00:40:44.152 "nvme_io": false 00:40:44.152 }, 00:40:44.152 "memory_domains": [ 00:40:44.152 { 00:40:44.152 "dma_device_id": "system", 00:40:44.152 "dma_device_type": 1 00:40:44.152 }, 00:40:44.152 { 00:40:44.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:44.152 "dma_device_type": 2 00:40:44.152 } 00:40:44.152 ], 00:40:44.152 "driver_specific": {} 00:40:44.152 } 00:40:44.152 ] 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:44.152 11:51:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:44.412 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:44.412 "name": "Existed_Raid", 00:40:44.412 "uuid": "f862fc37-0bac-48e0-b073-65005cbb71ba", 00:40:44.412 "strip_size_kb": 0, 00:40:44.412 "state": "online", 00:40:44.412 "raid_level": "raid1", 00:40:44.412 "superblock": true, 00:40:44.412 "num_base_bdevs": 2, 00:40:44.412 "num_base_bdevs_discovered": 2, 00:40:44.412 "num_base_bdevs_operational": 2, 00:40:44.412 "base_bdevs_list": [ 00:40:44.412 { 00:40:44.412 "name": "BaseBdev1", 00:40:44.412 "uuid": "0a1ab01a-8911-4617-b0e8-0771412c3a24", 00:40:44.412 "is_configured": true, 00:40:44.412 "data_offset": 256, 00:40:44.412 "data_size": 7936 00:40:44.412 }, 00:40:44.412 { 00:40:44.412 "name": "BaseBdev2", 00:40:44.412 "uuid": "5d81b2bb-9b68-429a-89ef-e61814933b6f", 00:40:44.412 "is_configured": true, 00:40:44.412 "data_offset": 256, 00:40:44.412 "data_size": 7936 00:40:44.412 } 00:40:44.412 ] 00:40:44.412 }' 00:40:44.412 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:44.412 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:44.671 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:40:44.930 [2024-06-10 11:51:28.750525] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:44.930 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:44.930 "name": "Existed_Raid", 00:40:44.930 "aliases": [ 00:40:44.930 "f862fc37-0bac-48e0-b073-65005cbb71ba" 00:40:44.930 ], 00:40:44.930 "product_name": "Raid Volume", 00:40:44.930 "block_size": 4096, 00:40:44.930 "num_blocks": 7936, 00:40:44.930 "uuid": "f862fc37-0bac-48e0-b073-65005cbb71ba", 00:40:44.930 "md_size": 32, 00:40:44.930 "md_interleave": false, 00:40:44.930 "dif_type": 0, 00:40:44.930 "assigned_rate_limits": { 00:40:44.930 "rw_ios_per_sec": 0, 00:40:44.930 "rw_mbytes_per_sec": 0, 00:40:44.930 "r_mbytes_per_sec": 0, 00:40:44.930 "w_mbytes_per_sec": 0 00:40:44.930 }, 00:40:44.930 "claimed": false, 00:40:44.930 "zoned": false, 00:40:44.930 "supported_io_types": { 00:40:44.930 "read": true, 00:40:44.930 "write": true, 00:40:44.930 "unmap": false, 00:40:44.930 "write_zeroes": true, 00:40:44.930 "flush": false, 00:40:44.930 "reset": true, 00:40:44.930 "compare": false, 00:40:44.930 
"compare_and_write": false, 00:40:44.930 "abort": false, 00:40:44.930 "nvme_admin": false, 00:40:44.930 "nvme_io": false 00:40:44.930 }, 00:40:44.930 "memory_domains": [ 00:40:44.930 { 00:40:44.930 "dma_device_id": "system", 00:40:44.930 "dma_device_type": 1 00:40:44.930 }, 00:40:44.930 { 00:40:44.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:44.930 "dma_device_type": 2 00:40:44.930 }, 00:40:44.930 { 00:40:44.930 "dma_device_id": "system", 00:40:44.930 "dma_device_type": 1 00:40:44.930 }, 00:40:44.930 { 00:40:44.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:44.930 "dma_device_type": 2 00:40:44.930 } 00:40:44.930 ], 00:40:44.930 "driver_specific": { 00:40:44.930 "raid": { 00:40:44.930 "uuid": "f862fc37-0bac-48e0-b073-65005cbb71ba", 00:40:44.930 "strip_size_kb": 0, 00:40:44.930 "state": "online", 00:40:44.930 "raid_level": "raid1", 00:40:44.930 "superblock": true, 00:40:44.930 "num_base_bdevs": 2, 00:40:44.930 "num_base_bdevs_discovered": 2, 00:40:44.930 "num_base_bdevs_operational": 2, 00:40:44.930 "base_bdevs_list": [ 00:40:44.930 { 00:40:44.930 "name": "BaseBdev1", 00:40:44.930 "uuid": "0a1ab01a-8911-4617-b0e8-0771412c3a24", 00:40:44.930 "is_configured": true, 00:40:44.930 "data_offset": 256, 00:40:44.930 "data_size": 7936 00:40:44.930 }, 00:40:44.930 { 00:40:44.930 "name": "BaseBdev2", 00:40:44.930 "uuid": "5d81b2bb-9b68-429a-89ef-e61814933b6f", 00:40:44.931 "is_configured": true, 00:40:44.931 "data_offset": 256, 00:40:44.931 "data_size": 7936 00:40:44.931 } 00:40:44.931 ] 00:40:44.931 } 00:40:44.931 } 00:40:44.931 }' 00:40:44.931 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:44.931 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:40:44.931 BaseBdev2' 00:40:44.931 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:40:44.931 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:40:44.931 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:45.190 11:51:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:45.190 "name": "BaseBdev1", 00:40:45.190 "aliases": [ 00:40:45.190 "0a1ab01a-8911-4617-b0e8-0771412c3a24" 00:40:45.190 ], 00:40:45.190 "product_name": "Malloc disk", 00:40:45.190 "block_size": 4096, 00:40:45.190 "num_blocks": 8192, 00:40:45.190 "uuid": "0a1ab01a-8911-4617-b0e8-0771412c3a24", 00:40:45.190 "md_size": 32, 00:40:45.190 "md_interleave": false, 00:40:45.190 "dif_type": 0, 00:40:45.190 "assigned_rate_limits": { 00:40:45.190 "rw_ios_per_sec": 0, 00:40:45.190 "rw_mbytes_per_sec": 0, 00:40:45.190 "r_mbytes_per_sec": 0, 00:40:45.190 "w_mbytes_per_sec": 0 00:40:45.190 }, 00:40:45.190 "claimed": true, 00:40:45.190 "claim_type": "exclusive_write", 00:40:45.190 "zoned": false, 00:40:45.190 "supported_io_types": { 00:40:45.190 "read": true, 00:40:45.190 "write": true, 00:40:45.190 "unmap": true, 00:40:45.190 "write_zeroes": true, 00:40:45.190 "flush": true, 00:40:45.190 "reset": true, 00:40:45.190 "compare": false, 00:40:45.190 "compare_and_write": false, 00:40:45.190 "abort": true, 00:40:45.190 "nvme_admin": false, 00:40:45.190 "nvme_io": false 00:40:45.190 }, 00:40:45.190 "memory_domains": [ 00:40:45.190 { 00:40:45.190 "dma_device_id": "system", 00:40:45.190 "dma_device_type": 1 00:40:45.190 }, 00:40:45.190 { 00:40:45.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:45.190 "dma_device_type": 2 00:40:45.190 } 00:40:45.190 ], 00:40:45.190 "driver_specific": {} 00:40:45.190 }' 00:40:45.190 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:40:45.190 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:45.190 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:45.190 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:45.191 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:40:45.450 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:45.709 "name": "BaseBdev2", 00:40:45.709 "aliases": [ 00:40:45.709 "5d81b2bb-9b68-429a-89ef-e61814933b6f" 
00:40:45.709 ], 00:40:45.709 "product_name": "Malloc disk", 00:40:45.709 "block_size": 4096, 00:40:45.709 "num_blocks": 8192, 00:40:45.709 "uuid": "5d81b2bb-9b68-429a-89ef-e61814933b6f", 00:40:45.709 "md_size": 32, 00:40:45.709 "md_interleave": false, 00:40:45.709 "dif_type": 0, 00:40:45.709 "assigned_rate_limits": { 00:40:45.709 "rw_ios_per_sec": 0, 00:40:45.709 "rw_mbytes_per_sec": 0, 00:40:45.709 "r_mbytes_per_sec": 0, 00:40:45.709 "w_mbytes_per_sec": 0 00:40:45.709 }, 00:40:45.709 "claimed": true, 00:40:45.709 "claim_type": "exclusive_write", 00:40:45.709 "zoned": false, 00:40:45.709 "supported_io_types": { 00:40:45.709 "read": true, 00:40:45.709 "write": true, 00:40:45.709 "unmap": true, 00:40:45.709 "write_zeroes": true, 00:40:45.709 "flush": true, 00:40:45.709 "reset": true, 00:40:45.709 "compare": false, 00:40:45.709 "compare_and_write": false, 00:40:45.709 "abort": true, 00:40:45.709 "nvme_admin": false, 00:40:45.709 "nvme_io": false 00:40:45.709 }, 00:40:45.709 "memory_domains": [ 00:40:45.709 { 00:40:45.709 "dma_device_id": "system", 00:40:45.709 "dma_device_type": 1 00:40:45.709 }, 00:40:45.709 { 00:40:45.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:45.709 "dma_device_type": 2 00:40:45.709 } 00:40:45.709 ], 00:40:45.709 "driver_specific": {} 00:40:45.709 }' 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
00:40:45.709 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:45.969 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:45.969 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:40:45.969 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:45.969 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:45.969 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:45.969 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:40:46.227 [2024-06-10 11:51:29.921415] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:46.227 11:51:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:46.227 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:46.227 "name": "Existed_Raid", 00:40:46.227 "uuid": "f862fc37-0bac-48e0-b073-65005cbb71ba", 00:40:46.227 "strip_size_kb": 0, 00:40:46.227 "state": "online", 00:40:46.227 "raid_level": "raid1", 00:40:46.227 "superblock": true, 00:40:46.227 "num_base_bdevs": 2, 00:40:46.227 "num_base_bdevs_discovered": 1, 00:40:46.227 "num_base_bdevs_operational": 1, 00:40:46.227 "base_bdevs_list": [ 00:40:46.227 { 00:40:46.227 "name": null, 00:40:46.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:46.227 "is_configured": false, 00:40:46.227 "data_offset": 256, 00:40:46.227 
"data_size": 7936 00:40:46.227 }, 00:40:46.227 { 00:40:46.227 "name": "BaseBdev2", 00:40:46.227 "uuid": "5d81b2bb-9b68-429a-89ef-e61814933b6f", 00:40:46.227 "is_configured": true, 00:40:46.227 "data_offset": 256, 00:40:46.227 "data_size": 7936 00:40:46.227 } 00:40:46.227 ] 00:40:46.227 }' 00:40:46.227 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:46.228 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:46.793 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:40:46.793 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:40:46.793 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:40:46.793 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:47.051 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:40:47.052 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:40:47.052 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:40:47.052 [2024-06-10 11:51:30.930349] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:40:47.052 [2024-06-10 11:51:30.930413] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:47.052 [2024-06-10 11:51:30.941383] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:47.052 [2024-06-10 11:51:30.941430] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:47.052 [2024-06-10 11:51:30.941439] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc3540 name Existed_Raid, state offline 00:40:47.052 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:40:47.052 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:40:47.052 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:40:47.052 11:51:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 250936 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 250936 ']' 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 250936 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 250936 00:40:47.311 11:51:31 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 250936' 00:40:47.311 killing process with pid 250936 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 250936 00:40:47.311 [2024-06-10 11:51:31.184036] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:47.311 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 250936 00:40:47.311 [2024-06-10 11:51:31.184891] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:47.570 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:40:47.570 00:40:47.570 real 0m8.212s 00:40:47.570 user 0m14.355s 00:40:47.570 sys 0m1.669s 00:40:47.570 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:47.570 11:51:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:47.570 ************************************ 00:40:47.570 END TEST raid_state_function_test_sb_md_separate 00:40:47.570 ************************************ 00:40:47.570 11:51:31 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:40:47.570 11:51:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:40:47.570 11:51:31 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:47.570 11:51:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:47.570 ************************************ 00:40:47.570 START TEST raid_superblock_test_md_separate 
00:40:47.570 ************************************ 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@411 -- # raid_pid=252281 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 252281 /var/tmp/spdk-raid.sock 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@830 -- # '[' -z 252281 ']' 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:47.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:47.570 11:51:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:47.829 [2024-06-10 11:51:31.521173] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
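Here `waitforlisten 252281 /var/tmp/spdk-raid.sock` blocks until the freshly launched `bdev_svc` target accepts connections on its RPC UNIX domain socket (the log shows `max_retries=100`). A self-contained Python sketch of that polling pattern; the socket path and in-process listener below are illustrative stand-ins for the real SPDK target:

```python
import os
import socket
import tempfile
import time

def waitforlisten(sock_path, max_retries=100, delay=0.1):
    """Poll a UNIX domain socket until a server accepts connections,
    roughly what autotest_common.sh's waitforlisten does before rpc.py
    may talk to the target. Returns False if the budget is exhausted."""
    for _ in range(max_retries):
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        try:
            s.connect(sock_path)
            return True
        except OSError:
            time.sleep(delay)
        finally:
            s.close()
    return False

# Demo: stand up a listener in-process instead of a real SPDK target.
path = os.path.join(tempfile.mkdtemp(), "demo.sock")
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(path)
server.listen(1)
print(waitforlisten(path))  # True: the socket is accepting connections
```

The real helper additionally checks that the PID it was given is still alive between retries, so a crashed target fails fast instead of burning the whole retry budget.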
00:40:47.829 [2024-06-10 11:51:31.521227] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid252281 ] 00:40:47.829 [2024-06-10 11:51:31.608150] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:47.829 [2024-06-10 11:51:31.686310] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:47.829 [2024-06-10 11:51:31.744307] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:47.829 [2024-06-10 11:51:31.744339] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@863 -- # return 0 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:40:48.400 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:40:48.400 11:51:32 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:40:48.659 malloc1 00:40:48.659 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:48.918 [2024-06-10 11:51:32.655468] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:48.918 [2024-06-10 11:51:32.655504] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:48.918 [2024-06-10 11:51:32.655519] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x113cc90 00:40:48.918 [2024-06-10 11:51:32.655528] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:48.918 [2024-06-10 11:51:32.656487] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:48.918 [2024-06-10 11:51:32.656509] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:48.918 pt1 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:40:48.918 11:51:32 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:40:48.918 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:40:48.918 malloc2 00:40:49.176 11:51:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:49.176 [2024-06-10 11:51:33.021208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:49.176 [2024-06-10 11:51:33.021243] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:49.176 [2024-06-10 11:51:33.021256] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1294810 00:40:49.176 [2024-06-10 11:51:33.021264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:49.176 [2024-06-10 11:51:33.022224] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:49.176 [2024-06-10 11:51:33.022248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:49.176 pt2 00:40:49.176 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:40:49.176 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:49.176 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:40:49.435 [2024-06-10 11:51:33.193666] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:49.435 [2024-06-10 11:51:33.194562] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:49.435 [2024-06-10 11:51:33.194664] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1294ee0 00:40:49.435 [2024-06-10 11:51:33.194673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:49.435 [2024-06-10 11:51:33.194717] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x113af90 00:40:49.435 [2024-06-10 11:51:33.194792] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1294ee0 00:40:49.435 [2024-06-10 11:51:33.194798] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1294ee0 00:40:49.435 [2024-06-10 11:51:33.194841] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:49.435 11:51:33 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:49.435 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:49.694 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:49.694 "name": "raid_bdev1", 00:40:49.694 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:49.694 "strip_size_kb": 0, 00:40:49.694 "state": "online", 00:40:49.694 "raid_level": "raid1", 00:40:49.694 "superblock": true, 00:40:49.694 "num_base_bdevs": 2, 00:40:49.694 "num_base_bdevs_discovered": 2, 00:40:49.694 "num_base_bdevs_operational": 2, 00:40:49.694 "base_bdevs_list": [ 00:40:49.694 { 00:40:49.694 "name": "pt1", 00:40:49.694 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:49.694 "is_configured": true, 00:40:49.694 "data_offset": 256, 00:40:49.694 "data_size": 7936 00:40:49.694 }, 00:40:49.694 { 00:40:49.694 "name": "pt2", 00:40:49.694 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:49.694 "is_configured": true, 00:40:49.694 "data_offset": 256, 00:40:49.694 "data_size": 7936 00:40:49.694 } 00:40:49.694 ] 00:40:49.694 }' 00:40:49.694 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:49.694 11:51:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:49.953 11:51:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:50.211 [2024-06-10 11:51:34.027995] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:50.212 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:50.212 "name": "raid_bdev1", 00:40:50.212 "aliases": [ 00:40:50.212 "21e9688d-7686-4990-9496-3f10179f23d5" 00:40:50.212 ], 00:40:50.212 "product_name": "Raid Volume", 00:40:50.212 "block_size": 4096, 00:40:50.212 "num_blocks": 7936, 00:40:50.212 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:50.212 "md_size": 32, 00:40:50.212 "md_interleave": false, 00:40:50.212 "dif_type": 0, 00:40:50.212 "assigned_rate_limits": { 00:40:50.212 "rw_ios_per_sec": 0, 00:40:50.212 "rw_mbytes_per_sec": 0, 00:40:50.212 "r_mbytes_per_sec": 0, 00:40:50.212 "w_mbytes_per_sec": 0 00:40:50.212 }, 00:40:50.212 "claimed": false, 00:40:50.212 "zoned": false, 00:40:50.212 "supported_io_types": { 00:40:50.212 "read": true, 00:40:50.212 "write": true, 00:40:50.212 "unmap": false, 00:40:50.212 "write_zeroes": true, 00:40:50.212 "flush": false, 00:40:50.212 "reset": true, 00:40:50.212 "compare": false, 00:40:50.212 "compare_and_write": false, 00:40:50.212 "abort": false, 
00:40:50.212 "nvme_admin": false, 00:40:50.212 "nvme_io": false 00:40:50.212 }, 00:40:50.212 "memory_domains": [ 00:40:50.212 { 00:40:50.212 "dma_device_id": "system", 00:40:50.212 "dma_device_type": 1 00:40:50.212 }, 00:40:50.212 { 00:40:50.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:50.212 "dma_device_type": 2 00:40:50.212 }, 00:40:50.212 { 00:40:50.212 "dma_device_id": "system", 00:40:50.212 "dma_device_type": 1 00:40:50.212 }, 00:40:50.212 { 00:40:50.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:50.212 "dma_device_type": 2 00:40:50.212 } 00:40:50.212 ], 00:40:50.212 "driver_specific": { 00:40:50.212 "raid": { 00:40:50.212 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:50.212 "strip_size_kb": 0, 00:40:50.212 "state": "online", 00:40:50.212 "raid_level": "raid1", 00:40:50.212 "superblock": true, 00:40:50.212 "num_base_bdevs": 2, 00:40:50.212 "num_base_bdevs_discovered": 2, 00:40:50.212 "num_base_bdevs_operational": 2, 00:40:50.212 "base_bdevs_list": [ 00:40:50.212 { 00:40:50.212 "name": "pt1", 00:40:50.212 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:50.212 "is_configured": true, 00:40:50.212 "data_offset": 256, 00:40:50.212 "data_size": 7936 00:40:50.212 }, 00:40:50.212 { 00:40:50.212 "name": "pt2", 00:40:50.212 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:50.212 "is_configured": true, 00:40:50.212 "data_offset": 256, 00:40:50.212 "data_size": 7936 00:40:50.212 } 00:40:50.212 ] 00:40:50.212 } 00:40:50.212 } 00:40:50.212 }' 00:40:50.212 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:50.212 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:40:50.212 pt2' 00:40:50.212 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:50.212 11:51:34 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:40:50.212 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:50.471 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:50.471 "name": "pt1", 00:40:50.471 "aliases": [ 00:40:50.471 "00000000-0000-0000-0000-000000000001" 00:40:50.471 ], 00:40:50.471 "product_name": "passthru", 00:40:50.471 "block_size": 4096, 00:40:50.471 "num_blocks": 8192, 00:40:50.471 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:50.471 "md_size": 32, 00:40:50.471 "md_interleave": false, 00:40:50.471 "dif_type": 0, 00:40:50.471 "assigned_rate_limits": { 00:40:50.471 "rw_ios_per_sec": 0, 00:40:50.471 "rw_mbytes_per_sec": 0, 00:40:50.471 "r_mbytes_per_sec": 0, 00:40:50.471 "w_mbytes_per_sec": 0 00:40:50.471 }, 00:40:50.471 "claimed": true, 00:40:50.471 "claim_type": "exclusive_write", 00:40:50.471 "zoned": false, 00:40:50.471 "supported_io_types": { 00:40:50.471 "read": true, 00:40:50.471 "write": true, 00:40:50.471 "unmap": true, 00:40:50.471 "write_zeroes": true, 00:40:50.471 "flush": true, 00:40:50.471 "reset": true, 00:40:50.471 "compare": false, 00:40:50.471 "compare_and_write": false, 00:40:50.471 "abort": true, 00:40:50.471 "nvme_admin": false, 00:40:50.471 "nvme_io": false 00:40:50.471 }, 00:40:50.471 "memory_domains": [ 00:40:50.471 { 00:40:50.471 "dma_device_id": "system", 00:40:50.471 "dma_device_type": 1 00:40:50.471 }, 00:40:50.471 { 00:40:50.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:50.471 "dma_device_type": 2 00:40:50.471 } 00:40:50.471 ], 00:40:50.471 "driver_specific": { 00:40:50.471 "passthru": { 00:40:50.471 "name": "pt1", 00:40:50.471 "base_bdev_name": "malloc1" 00:40:50.471 } 00:40:50.471 } 00:40:50.471 }' 00:40:50.471 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:40:50.471 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:50.471 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:50.471 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:50.471 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:50.730 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:50.989 "name": "pt2", 00:40:50.989 "aliases": [ 00:40:50.989 "00000000-0000-0000-0000-000000000002" 00:40:50.989 ], 00:40:50.989 "product_name": "passthru", 00:40:50.989 "block_size": 4096, 00:40:50.989 "num_blocks": 8192, 
00:40:50.989 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:50.989 "md_size": 32, 00:40:50.989 "md_interleave": false, 00:40:50.989 "dif_type": 0, 00:40:50.989 "assigned_rate_limits": { 00:40:50.989 "rw_ios_per_sec": 0, 00:40:50.989 "rw_mbytes_per_sec": 0, 00:40:50.989 "r_mbytes_per_sec": 0, 00:40:50.989 "w_mbytes_per_sec": 0 00:40:50.989 }, 00:40:50.989 "claimed": true, 00:40:50.989 "claim_type": "exclusive_write", 00:40:50.989 "zoned": false, 00:40:50.989 "supported_io_types": { 00:40:50.989 "read": true, 00:40:50.989 "write": true, 00:40:50.989 "unmap": true, 00:40:50.989 "write_zeroes": true, 00:40:50.989 "flush": true, 00:40:50.989 "reset": true, 00:40:50.989 "compare": false, 00:40:50.989 "compare_and_write": false, 00:40:50.989 "abort": true, 00:40:50.989 "nvme_admin": false, 00:40:50.989 "nvme_io": false 00:40:50.989 }, 00:40:50.989 "memory_domains": [ 00:40:50.989 { 00:40:50.989 "dma_device_id": "system", 00:40:50.989 "dma_device_type": 1 00:40:50.989 }, 00:40:50.989 { 00:40:50.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:50.989 "dma_device_type": 2 00:40:50.989 } 00:40:50.989 ], 00:40:50.989 "driver_specific": { 00:40:50.989 "passthru": { 00:40:50.989 "name": "pt2", 00:40:50.989 "base_bdev_name": "malloc2" 00:40:50.989 } 00:40:50.989 } 00:40:50.989 }' 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:50.989 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:50.989 11:51:34 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:51.266 11:51:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:51.266 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:40:51.266 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:51.266 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:51.266 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:51.266 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:51.266 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:40:51.524 [2024-06-10 11:51:35.239079] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:51.524 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=21e9688d-7686-4990-9496-3f10179f23d5 00:40:51.524 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 21e9688d-7686-4990-9496-3f10179f23d5 ']' 00:40:51.525 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:51.525 [2024-06-10 11:51:35.419397] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:51.525 [2024-06-10 11:51:35.419417] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:51.525 [2024-06-10 11:51:35.419462] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:51.525 [2024-06-10 
11:51:35.419501] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:51.525 [2024-06-10 11:51:35.419509] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1294ee0 name raid_bdev1, state offline 00:40:51.525 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:51.525 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:40:51.783 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:40:51.783 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:40:51.783 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:40:51.783 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:40:52.042 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:40:52.042 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:40:52.042 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:40:52.042 11:51:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:40:52.301 11:51:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:40:52.301 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:40:52.302 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:52.560 [2024-06-10 11:51:36.297652] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:40:52.560 [2024-06-10 11:51:36.298624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:40:52.560 [2024-06-10 11:51:36.298670] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:40:52.560 [2024-06-10 11:51:36.298699] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:40:52.560 [2024-06-10 11:51:36.298712] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:52.560 [2024-06-10 11:51:36.298719] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x127e890 name raid_bdev1, state configuring 00:40:52.560 request: 00:40:52.560 { 00:40:52.560 "name": "raid_bdev1", 00:40:52.560 "raid_level": "raid1", 00:40:52.560 "base_bdevs": [ 00:40:52.560 "malloc1", 00:40:52.560 "malloc2" 00:40:52.560 ], 00:40:52.560 "superblock": false, 00:40:52.560 "method": "bdev_raid_create", 00:40:52.560 "req_id": 1 00:40:52.560 } 00:40:52.560 Got JSON-RPC error response 00:40:52.560 response: 00:40:52.560 { 00:40:52.560 "code": -17, 00:40:52.560 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:40:52.560 } 00:40:52.560 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # es=1 00:40:52.560 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:40:52.561 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:40:52.561 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:40:52.561 11:51:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:52.561 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:40:52.561 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:40:52.561 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:40:52.561 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:52.820 [2024-06-10 11:51:36.642515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:52.820 [2024-06-10 11:51:36.642558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:52.820 [2024-06-10 11:51:36.642572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x113d150 00:40:52.820 [2024-06-10 11:51:36.642580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:52.820 [2024-06-10 11:51:36.643688] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:52.820 [2024-06-10 11:51:36.643711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:52.820 [2024-06-10 11:51:36.643749] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:40:52.820 [2024-06-10 11:51:36.643771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:52.820 pt1 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:52.820 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:53.079 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:53.079 "name": "raid_bdev1", 00:40:53.079 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:53.079 "strip_size_kb": 0, 00:40:53.079 "state": "configuring", 00:40:53.079 "raid_level": "raid1", 00:40:53.079 "superblock": true, 00:40:53.079 "num_base_bdevs": 2, 00:40:53.079 "num_base_bdevs_discovered": 1, 00:40:53.079 "num_base_bdevs_operational": 2, 00:40:53.079 "base_bdevs_list": [ 00:40:53.079 { 00:40:53.079 "name": "pt1", 00:40:53.079 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:53.079 "is_configured": true, 00:40:53.079 
"data_offset": 256, 00:40:53.079 "data_size": 7936 00:40:53.079 }, 00:40:53.079 { 00:40:53.079 "name": null, 00:40:53.079 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:53.079 "is_configured": false, 00:40:53.079 "data_offset": 256, 00:40:53.079 "data_size": 7936 00:40:53.079 } 00:40:53.079 ] 00:40:53.079 }' 00:40:53.079 11:51:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:53.079 11:51:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:53.650 [2024-06-10 11:51:37.480674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:53.650 [2024-06-10 11:51:37.480711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:53.650 [2024-06-10 11:51:37.480724] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127fba0 00:40:53.650 [2024-06-10 11:51:37.480733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:53.650 [2024-06-10 11:51:37.480874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:53.650 [2024-06-10 11:51:37.480885] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:53.650 [2024-06-10 11:51:37.480932] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:40:53.650 
[2024-06-10 11:51:37.480945] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:53.650 [2024-06-10 11:51:37.481021] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12805c0 00:40:53.650 [2024-06-10 11:51:37.481028] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:53.650 [2024-06-10 11:51:37.481068] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x113d410 00:40:53.650 [2024-06-10 11:51:37.481135] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12805c0 00:40:53.650 [2024-06-10 11:51:37.481141] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12805c0 00:40:53.650 [2024-06-10 11:51:37.481185] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:53.650 pt2 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:53.650 11:51:37 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:53.650 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:53.991 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:53.991 "name": "raid_bdev1", 00:40:53.991 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:53.991 "strip_size_kb": 0, 00:40:53.991 "state": "online", 00:40:53.991 "raid_level": "raid1", 00:40:53.991 "superblock": true, 00:40:53.991 "num_base_bdevs": 2, 00:40:53.991 "num_base_bdevs_discovered": 2, 00:40:53.991 "num_base_bdevs_operational": 2, 00:40:53.991 "base_bdevs_list": [ 00:40:53.991 { 00:40:53.991 "name": "pt1", 00:40:53.991 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:53.991 "is_configured": true, 00:40:53.991 "data_offset": 256, 00:40:53.991 "data_size": 7936 00:40:53.991 }, 00:40:53.991 { 00:40:53.991 "name": "pt2", 00:40:53.991 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:53.991 "is_configured": true, 00:40:53.991 "data_offset": 256, 00:40:53.991 "data_size": 7936 00:40:53.991 } 00:40:53.991 ] 00:40:53.991 }' 00:40:53.991 11:51:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:53.991 11:51:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 
00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:54.254 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:54.513 [2024-06-10 11:51:38.319027] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:54.513 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:54.513 "name": "raid_bdev1", 00:40:54.513 "aliases": [ 00:40:54.513 "21e9688d-7686-4990-9496-3f10179f23d5" 00:40:54.513 ], 00:40:54.513 "product_name": "Raid Volume", 00:40:54.513 "block_size": 4096, 00:40:54.513 "num_blocks": 7936, 00:40:54.513 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:54.513 "md_size": 32, 00:40:54.513 "md_interleave": false, 00:40:54.513 "dif_type": 0, 00:40:54.513 "assigned_rate_limits": { 00:40:54.513 "rw_ios_per_sec": 0, 00:40:54.513 "rw_mbytes_per_sec": 0, 00:40:54.513 "r_mbytes_per_sec": 0, 00:40:54.513 "w_mbytes_per_sec": 0 00:40:54.513 }, 00:40:54.513 "claimed": false, 00:40:54.513 "zoned": false, 00:40:54.513 "supported_io_types": { 00:40:54.513 "read": true, 00:40:54.513 "write": true, 00:40:54.513 "unmap": false, 00:40:54.513 "write_zeroes": true, 00:40:54.513 "flush": false, 00:40:54.513 "reset": true, 
00:40:54.513 "compare": false, 00:40:54.513 "compare_and_write": false, 00:40:54.513 "abort": false, 00:40:54.513 "nvme_admin": false, 00:40:54.513 "nvme_io": false 00:40:54.513 }, 00:40:54.513 "memory_domains": [ 00:40:54.513 { 00:40:54.513 "dma_device_id": "system", 00:40:54.513 "dma_device_type": 1 00:40:54.513 }, 00:40:54.513 { 00:40:54.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:54.513 "dma_device_type": 2 00:40:54.513 }, 00:40:54.513 { 00:40:54.513 "dma_device_id": "system", 00:40:54.513 "dma_device_type": 1 00:40:54.513 }, 00:40:54.513 { 00:40:54.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:54.513 "dma_device_type": 2 00:40:54.513 } 00:40:54.513 ], 00:40:54.513 "driver_specific": { 00:40:54.513 "raid": { 00:40:54.513 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:54.513 "strip_size_kb": 0, 00:40:54.513 "state": "online", 00:40:54.513 "raid_level": "raid1", 00:40:54.513 "superblock": true, 00:40:54.513 "num_base_bdevs": 2, 00:40:54.513 "num_base_bdevs_discovered": 2, 00:40:54.513 "num_base_bdevs_operational": 2, 00:40:54.513 "base_bdevs_list": [ 00:40:54.513 { 00:40:54.513 "name": "pt1", 00:40:54.513 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:54.513 "is_configured": true, 00:40:54.513 "data_offset": 256, 00:40:54.513 "data_size": 7936 00:40:54.513 }, 00:40:54.513 { 00:40:54.513 "name": "pt2", 00:40:54.513 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:54.513 "is_configured": true, 00:40:54.513 "data_offset": 256, 00:40:54.514 "data_size": 7936 00:40:54.514 } 00:40:54.514 ] 00:40:54.514 } 00:40:54.514 } 00:40:54.514 }' 00:40:54.514 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:54.514 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:40:54.514 pt2' 00:40:54.514 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # 
for name in $base_bdev_names 00:40:54.514 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:40:54.514 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:54.772 "name": "pt1", 00:40:54.772 "aliases": [ 00:40:54.772 "00000000-0000-0000-0000-000000000001" 00:40:54.772 ], 00:40:54.772 "product_name": "passthru", 00:40:54.772 "block_size": 4096, 00:40:54.772 "num_blocks": 8192, 00:40:54.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:54.772 "md_size": 32, 00:40:54.772 "md_interleave": false, 00:40:54.772 "dif_type": 0, 00:40:54.772 "assigned_rate_limits": { 00:40:54.772 "rw_ios_per_sec": 0, 00:40:54.772 "rw_mbytes_per_sec": 0, 00:40:54.772 "r_mbytes_per_sec": 0, 00:40:54.772 "w_mbytes_per_sec": 0 00:40:54.772 }, 00:40:54.772 "claimed": true, 00:40:54.772 "claim_type": "exclusive_write", 00:40:54.772 "zoned": false, 00:40:54.772 "supported_io_types": { 00:40:54.772 "read": true, 00:40:54.772 "write": true, 00:40:54.772 "unmap": true, 00:40:54.772 "write_zeroes": true, 00:40:54.772 "flush": true, 00:40:54.772 "reset": true, 00:40:54.772 "compare": false, 00:40:54.772 "compare_and_write": false, 00:40:54.772 "abort": true, 00:40:54.772 "nvme_admin": false, 00:40:54.772 "nvme_io": false 00:40:54.772 }, 00:40:54.772 "memory_domains": [ 00:40:54.772 { 00:40:54.772 "dma_device_id": "system", 00:40:54.772 "dma_device_type": 1 00:40:54.772 }, 00:40:54.772 { 00:40:54.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:54.772 "dma_device_type": 2 00:40:54.772 } 00:40:54.772 ], 00:40:54.772 "driver_specific": { 00:40:54.772 "passthru": { 00:40:54.772 "name": "pt1", 00:40:54.772 "base_bdev_name": "malloc1" 00:40:54.772 } 00:40:54.772 } 00:40:54.772 }' 00:40:54.772 11:51:38 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:54.772 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:40:55.031 11:51:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:55.291 "name": "pt2", 00:40:55.291 "aliases": [ 00:40:55.291 "00000000-0000-0000-0000-000000000002" 00:40:55.291 ], 00:40:55.291 
"product_name": "passthru", 00:40:55.291 "block_size": 4096, 00:40:55.291 "num_blocks": 8192, 00:40:55.291 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:55.291 "md_size": 32, 00:40:55.291 "md_interleave": false, 00:40:55.291 "dif_type": 0, 00:40:55.291 "assigned_rate_limits": { 00:40:55.291 "rw_ios_per_sec": 0, 00:40:55.291 "rw_mbytes_per_sec": 0, 00:40:55.291 "r_mbytes_per_sec": 0, 00:40:55.291 "w_mbytes_per_sec": 0 00:40:55.291 }, 00:40:55.291 "claimed": true, 00:40:55.291 "claim_type": "exclusive_write", 00:40:55.291 "zoned": false, 00:40:55.291 "supported_io_types": { 00:40:55.291 "read": true, 00:40:55.291 "write": true, 00:40:55.291 "unmap": true, 00:40:55.291 "write_zeroes": true, 00:40:55.291 "flush": true, 00:40:55.291 "reset": true, 00:40:55.291 "compare": false, 00:40:55.291 "compare_and_write": false, 00:40:55.291 "abort": true, 00:40:55.291 "nvme_admin": false, 00:40:55.291 "nvme_io": false 00:40:55.291 }, 00:40:55.291 "memory_domains": [ 00:40:55.291 { 00:40:55.291 "dma_device_id": "system", 00:40:55.291 "dma_device_type": 1 00:40:55.291 }, 00:40:55.291 { 00:40:55.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:55.291 "dma_device_type": 2 00:40:55.291 } 00:40:55.291 ], 00:40:55.291 "driver_specific": { 00:40:55.291 "passthru": { 00:40:55.291 "name": "pt2", 00:40:55.291 "base_bdev_name": "malloc2" 00:40:55.291 } 00:40:55.291 } 00:40:55.291 }' 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:55.291 11:51:39 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:55.291 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:55.551 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:40:55.551 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:55.551 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:55.551 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:55.551 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:55.551 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:40:55.551 [2024-06-10 11:51:39.482054] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 21e9688d-7686-4990-9496-3f10179f23d5 '!=' 21e9688d-7686-4990-9496-3f10179f23d5 ']' 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:40:55.810 [2024-06-10 11:51:39.662479] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:55.810 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:56.069 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:56.069 "name": "raid_bdev1", 00:40:56.069 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:56.069 "strip_size_kb": 0, 00:40:56.069 "state": "online", 00:40:56.069 "raid_level": "raid1", 00:40:56.069 "superblock": true, 00:40:56.069 
"num_base_bdevs": 2, 00:40:56.069 "num_base_bdevs_discovered": 1, 00:40:56.069 "num_base_bdevs_operational": 1, 00:40:56.069 "base_bdevs_list": [ 00:40:56.069 { 00:40:56.069 "name": null, 00:40:56.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:56.069 "is_configured": false, 00:40:56.069 "data_offset": 256, 00:40:56.069 "data_size": 7936 00:40:56.069 }, 00:40:56.069 { 00:40:56.069 "name": "pt2", 00:40:56.069 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:56.069 "is_configured": true, 00:40:56.069 "data_offset": 256, 00:40:56.069 "data_size": 7936 00:40:56.069 } 00:40:56.069 ] 00:40:56.069 }' 00:40:56.069 11:51:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:56.069 11:51:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:56.636 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:56.637 [2024-06-10 11:51:40.536685] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:56.637 [2024-06-10 11:51:40.536707] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:56.637 [2024-06-10 11:51:40.536748] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:56.637 [2024-06-10 11:51:40.536778] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:56.637 [2024-06-10 11:51:40.536786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12805c0 name raid_bdev1, state offline 00:40:56.637 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:56.637 11:51:40 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:40:56.896 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:40:56.896 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:40:56.896 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:40:56.896 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:40:56.896 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:40:57.155 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:40:57.155 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:40:57.155 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:40:57.155 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:40:57.155 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:40:57.155 11:51:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:57.155 [2024-06-10 11:51:41.070038] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:57.155 [2024-06-10 11:51:41.070070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:57.155 [2024-06-10 11:51:41.070082] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1281ea0 00:40:57.155 [2024-06-10 11:51:41.070090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:57.155 
[2024-06-10 11:51:41.071113] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:57.155 [2024-06-10 11:51:41.071134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:57.155 [2024-06-10 11:51:41.071168] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:40:57.155 [2024-06-10 11:51:41.071187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:57.155 [2024-06-10 11:51:41.071243] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x113b200 00:40:57.155 [2024-06-10 11:51:41.071250] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:57.155 [2024-06-10 11:51:41.071288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12939e0 00:40:57.155 [2024-06-10 11:51:41.071352] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x113b200 00:40:57.155 [2024-06-10 11:51:41.071358] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x113b200 00:40:57.155 [2024-06-10 11:51:41.071405] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:57.155 pt2 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:57.155 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:57.415 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:57.415 "name": "raid_bdev1", 00:40:57.415 "uuid": "21e9688d-7686-4990-9496-3f10179f23d5", 00:40:57.415 "strip_size_kb": 0, 00:40:57.415 "state": "online", 00:40:57.415 "raid_level": "raid1", 00:40:57.415 "superblock": true, 00:40:57.415 "num_base_bdevs": 2, 00:40:57.415 "num_base_bdevs_discovered": 1, 00:40:57.415 "num_base_bdevs_operational": 1, 00:40:57.415 "base_bdevs_list": [ 00:40:57.415 { 00:40:57.415 "name": null, 00:40:57.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:57.415 "is_configured": false, 00:40:57.415 "data_offset": 256, 00:40:57.415 "data_size": 7936 00:40:57.415 }, 00:40:57.415 { 00:40:57.415 "name": "pt2", 00:40:57.415 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:57.415 "is_configured": true, 00:40:57.415 "data_offset": 256, 00:40:57.415 "data_size": 7936 00:40:57.415 } 00:40:57.415 ] 00:40:57.415 }' 00:40:57.415 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:57.415 11:51:41 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@10 -- # set +x 00:40:57.983 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:58.243 [2024-06-10 11:51:41.940306] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:58.243 [2024-06-10 11:51:41.940328] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:58.243 [2024-06-10 11:51:41.940370] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:58.243 [2024-06-10 11:51:41.940399] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:58.243 [2024-06-10 11:51:41.940407] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x113b200 name raid_bdev1, state offline 00:40:58.243 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:58.243 11:51:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:40:58.243 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:40:58.243 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:40:58.243 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:40:58.243 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:58.501 [2024-06-10 11:51:42.297231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:58.501 [2024-06-10 11:51:42.297265] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:58.501 [2024-06-10 11:51:42.297278] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x113aba0 00:40:58.501 [2024-06-10 11:51:42.297287] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:58.501 [2024-06-10 11:51:42.298312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:58.501 [2024-06-10 11:51:42.298334] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:58.501 [2024-06-10 11:51:42.298369] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:40:58.501 [2024-06-10 11:51:42.298389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:58.501 [2024-06-10 11:51:42.298451] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:40:58.501 [2024-06-10 11:51:42.298460] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:58.501 [2024-06-10 11:51:42.298469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1282310 name raid_bdev1, state configuring 00:40:58.501 [2024-06-10 11:51:42.298484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:58.501 [2024-06-10 11:51:42.298517] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1294a40 00:40:58.501 [2024-06-10 11:51:42.298524] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:40:58.501 [2024-06-10 11:51:42.298559] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12939e0 00:40:58.501 [2024-06-10 11:51:42.298623] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1294a40 00:40:58.501 [2024-06-10 11:51:42.298629] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1294a40 00:40:58.501 [2024-06-10 11:51:42.298672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:58.501 pt1 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:58.501 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:58.760 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:58.760 "name": "raid_bdev1", 00:40:58.760 "uuid": 
"21e9688d-7686-4990-9496-3f10179f23d5", 00:40:58.760 "strip_size_kb": 0, 00:40:58.760 "state": "online", 00:40:58.760 "raid_level": "raid1", 00:40:58.760 "superblock": true, 00:40:58.760 "num_base_bdevs": 2, 00:40:58.760 "num_base_bdevs_discovered": 1, 00:40:58.760 "num_base_bdevs_operational": 1, 00:40:58.760 "base_bdevs_list": [ 00:40:58.760 { 00:40:58.760 "name": null, 00:40:58.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:58.760 "is_configured": false, 00:40:58.760 "data_offset": 256, 00:40:58.760 "data_size": 7936 00:40:58.760 }, 00:40:58.760 { 00:40:58.760 "name": "pt2", 00:40:58.760 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:58.760 "is_configured": true, 00:40:58.760 "data_offset": 256, 00:40:58.760 "data_size": 7936 00:40:58.760 } 00:40:58.760 ] 00:40:58.760 }' 00:40:58.760 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:58.760 11:51:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:59.331 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:40:59.331 11:51:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:40:59.331 11:51:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:40:59.331 11:51:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:59.331 11:51:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:40:59.591 [2024-06-10 11:51:43.328068] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:59.591 11:51:43 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 21e9688d-7686-4990-9496-3f10179f23d5 '!=' 21e9688d-7686-4990-9496-3f10179f23d5 ']' 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 252281 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@949 -- # '[' -z 252281 ']' 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # kill -0 252281 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # uname 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 252281 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 252281' 00:40:59.591 killing process with pid 252281 00:40:59.591 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # kill 252281 00:40:59.591 [2024-06-10 11:51:43.394168] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:59.591 [2024-06-10 11:51:43.394210] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:59.591 [2024-06-10 11:51:43.394242] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:59.591 [2024-06-10 11:51:43.394250] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1294a40 name raid_bdev1, state offline 00:40:59.591 11:51:43 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@973 -- # wait 252281 00:40:59.591 [2024-06-10 11:51:43.418956] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:59.850 11:51:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:40:59.850 00:40:59.850 real 0m12.157s 00:40:59.850 user 0m21.803s 00:40:59.850 sys 0m2.456s 00:40:59.850 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:59.850 11:51:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:59.850 ************************************ 00:40:59.850 END TEST raid_superblock_test_md_separate 00:40:59.850 ************************************ 00:40:59.850 11:51:43 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:40:59.850 11:51:43 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:40:59.850 11:51:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:40:59.850 11:51:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:59.850 11:51:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:59.850 ************************************ 00:40:59.850 START TEST raid_rebuild_test_sb_md_separate 00:40:59.850 ************************************ 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- 
# local background_io=false 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:40:59.851 
11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=254188 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 254188 /var/tmp/spdk-raid.sock 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 254188 ']' 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:59.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:59.851 11:51:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:40:59.851 [2024-06-10 11:51:43.776287] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:40:59.851 [2024-06-10 11:51:43.776340] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid254188 ] 00:40:59.851 I/O size of 3145728 is greater than zero copy threshold (65536). 00:40:59.851 Zero copy mechanism will not be used. 00:41:00.110 [2024-06-10 11:51:43.862246] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:00.110 [2024-06-10 11:51:43.940360] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:00.110 [2024-06-10 11:51:43.995420] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:00.110 [2024-06-10 11:51:43.995452] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:00.678 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:00.678 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:41:00.678 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:41:00.678 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:41:00.938 BaseBdev1_malloc 00:41:00.938 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:41:01.197 [2024-06-10 11:51:44.919182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:41:01.197 [2024-06-10 11:51:44.919222] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:01.197 [2024-06-10 11:51:44.919238] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11df230 00:41:01.197 [2024-06-10 11:51:44.919247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:01.197 [2024-06-10 11:51:44.920297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:01.197 [2024-06-10 11:51:44.920320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:41:01.197 BaseBdev1 00:41:01.197 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:41:01.197 11:51:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:41:01.197 BaseBdev2_malloc 00:41:01.197 11:51:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:41:01.461 [2024-06-10 11:51:45.288710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:41:01.461 [2024-06-10 11:51:45.288753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:01.461 [2024-06-10 11:51:45.288768] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1337180 00:41:01.461 [2024-06-10 11:51:45.288777] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:01.461 [2024-06-10 11:51:45.289708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:01.461 [2024-06-10 11:51:45.289732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:41:01.461 BaseBdev2 00:41:01.461 11:51:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:41:01.720 spare_malloc 00:41:01.720 11:51:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:41:01.720 spare_delay 00:41:01.720 11:51:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:41:01.979 [2024-06-10 11:51:45.814488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:41:01.979 [2024-06-10 11:51:45.814528] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:01.979 [2024-06-10 11:51:45.814543] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13178c0 00:41:01.979 [2024-06-10 11:51:45.814552] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:01.979 [2024-06-10 11:51:45.815412] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:01.979 [2024-06-10 11:51:45.815433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:41:01.979 spare 00:41:01.979 11:51:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:41:02.238 [2024-06-10 11:51:45.986951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:41:02.238 [2024-06-10 11:51:45.987750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:41:02.238 [2024-06-10 11:51:45.987880] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1318d40 00:41:02.238 [2024-06-10 11:51:45.987889] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:41:02.238 [2024-06-10 11:51:45.987935] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1317c10 00:41:02.238 [2024-06-10 11:51:45.988012] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1318d40 00:41:02.238 [2024-06-10 11:51:45.988018] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1318d40 00:41:02.238 [2024-06-10 11:51:45.988060] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:02.238 11:51:46 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:02.238 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:02.238 "name": "raid_bdev1", 00:41:02.239 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:02.239 "strip_size_kb": 0, 00:41:02.239 "state": "online", 00:41:02.239 "raid_level": "raid1", 00:41:02.239 "superblock": true, 00:41:02.239 "num_base_bdevs": 2, 00:41:02.239 "num_base_bdevs_discovered": 2, 00:41:02.239 "num_base_bdevs_operational": 2, 00:41:02.239 "base_bdevs_list": [ 00:41:02.239 { 00:41:02.239 "name": "BaseBdev1", 00:41:02.239 "uuid": "4a27a1ff-64b2-5147-a0d1-129e393962a0", 00:41:02.239 "is_configured": true, 00:41:02.239 "data_offset": 256, 00:41:02.239 "data_size": 7936 00:41:02.239 }, 00:41:02.239 { 00:41:02.239 "name": "BaseBdev2", 00:41:02.239 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:02.239 "is_configured": true, 00:41:02.239 "data_offset": 256, 00:41:02.239 "data_size": 7936 00:41:02.239 } 00:41:02.239 ] 00:41:02.239 }' 00:41:02.239 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:02.497 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:02.756 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:02.756 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:41:03.016 [2024-06-10 11:51:46.813239] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:03.016 
11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:41:03.016 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:03.016 11:51:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:41:03.275 11:51:47 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:41:03.275 [2024-06-10 11:51:47.162025] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1317c10 00:41:03.275 /dev/nbd0 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:41:03.275 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:03.276 1+0 records in 00:41:03.276 1+0 records out 00:41:03.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000215877 s, 19.0 MB/s 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:41:03.276 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:41:03.843 7936+0 records in 00:41:03.843 7936+0 records out 00:41:03.843 32505856 bytes (33 MB, 31 MiB) copied, 0.498517 s, 65.2 MB/s 00:41:03.843 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:41:03.843 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:41:03.843 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:41:03.843 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:41:03.843 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:41:03.843 11:51:47 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:03.843 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:41:04.102 [2024-06-10 11:51:47.915186] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:41:04.102 11:51:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:41:04.361 [2024-06-10 11:51:48.079646] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:04.361 "name": "raid_bdev1", 00:41:04.361 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:04.361 "strip_size_kb": 0, 00:41:04.361 "state": "online", 00:41:04.361 "raid_level": "raid1", 00:41:04.361 "superblock": true, 00:41:04.361 "num_base_bdevs": 2, 00:41:04.361 "num_base_bdevs_discovered": 1, 00:41:04.361 "num_base_bdevs_operational": 1, 00:41:04.361 "base_bdevs_list": [ 00:41:04.361 { 00:41:04.361 "name": null, 00:41:04.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:04.361 "is_configured": false, 00:41:04.361 "data_offset": 256, 00:41:04.361 "data_size": 7936 00:41:04.361 }, 00:41:04.361 { 00:41:04.361 "name": "BaseBdev2", 
00:41:04.361 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:04.361 "is_configured": true, 00:41:04.361 "data_offset": 256, 00:41:04.361 "data_size": 7936 00:41:04.361 } 00:41:04.361 ] 00:41:04.361 }' 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:04.361 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:04.929 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:41:05.188 [2024-06-10 11:51:48.917821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:05.188 [2024-06-10 11:51:48.919846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1317c10 00:41:05.188 [2024-06-10 11:51:48.921407] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:41:05.188 11:51:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:41:06.125 11:51:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:06.125 11:51:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:06.125 11:51:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:06.125 11:51:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:06.125 11:51:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:06.125 11:51:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:06.125 11:51:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:06.385 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:06.385 "name": "raid_bdev1", 00:41:06.385 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:06.385 "strip_size_kb": 0, 00:41:06.385 "state": "online", 00:41:06.385 "raid_level": "raid1", 00:41:06.385 "superblock": true, 00:41:06.385 "num_base_bdevs": 2, 00:41:06.385 "num_base_bdevs_discovered": 2, 00:41:06.385 "num_base_bdevs_operational": 2, 00:41:06.385 "process": { 00:41:06.385 "type": "rebuild", 00:41:06.385 "target": "spare", 00:41:06.385 "progress": { 00:41:06.385 "blocks": 2816, 00:41:06.385 "percent": 35 00:41:06.385 } 00:41:06.385 }, 00:41:06.385 "base_bdevs_list": [ 00:41:06.385 { 00:41:06.385 "name": "spare", 00:41:06.385 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:06.385 "is_configured": true, 00:41:06.385 "data_offset": 256, 00:41:06.385 "data_size": 7936 00:41:06.385 }, 00:41:06.385 { 00:41:06.385 "name": "BaseBdev2", 00:41:06.385 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:06.385 "is_configured": true, 00:41:06.385 "data_offset": 256, 00:41:06.385 "data_size": 7936 00:41:06.385 } 00:41:06.385 ] 00:41:06.385 }' 00:41:06.385 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:06.385 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:06.385 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:06.385 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:06.385 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:41:06.644 [2024-06-10 11:51:50.378161] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:06.644 [2024-06-10 11:51:50.432568] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:41:06.644 [2024-06-10 11:51:50.432602] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:06.644 [2024-06-10 11:51:50.432612] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:06.644 [2024-06-10 11:51:50.432618] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:06.644 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:06.903 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:06.903 "name": "raid_bdev1", 00:41:06.903 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:06.903 "strip_size_kb": 0, 00:41:06.903 "state": "online", 00:41:06.903 "raid_level": "raid1", 00:41:06.903 "superblock": true, 00:41:06.903 "num_base_bdevs": 2, 00:41:06.903 "num_base_bdevs_discovered": 1, 00:41:06.903 "num_base_bdevs_operational": 1, 00:41:06.903 "base_bdevs_list": [ 00:41:06.903 { 00:41:06.903 "name": null, 00:41:06.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:06.903 "is_configured": false, 00:41:06.903 "data_offset": 256, 00:41:06.903 "data_size": 7936 00:41:06.903 }, 00:41:06.903 { 00:41:06.903 "name": "BaseBdev2", 00:41:06.903 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:06.903 "is_configured": true, 00:41:06.903 "data_offset": 256, 00:41:06.903 "data_size": 7936 00:41:06.903 } 00:41:06.903 ] 00:41:06.903 }' 00:41:06.903 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:06.903 11:51:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:07.162 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:07.162 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:07.162 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:07.162 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:07.162 11:51:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:07.162 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:07.421 "name": "raid_bdev1", 00:41:07.421 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:07.421 "strip_size_kb": 0, 00:41:07.421 "state": "online", 00:41:07.421 "raid_level": "raid1", 00:41:07.421 "superblock": true, 00:41:07.421 "num_base_bdevs": 2, 00:41:07.421 "num_base_bdevs_discovered": 1, 00:41:07.421 "num_base_bdevs_operational": 1, 00:41:07.421 "base_bdevs_list": [ 00:41:07.421 { 00:41:07.421 "name": null, 00:41:07.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:07.421 "is_configured": false, 00:41:07.421 "data_offset": 256, 00:41:07.421 "data_size": 7936 00:41:07.421 }, 00:41:07.421 { 00:41:07.421 "name": "BaseBdev2", 00:41:07.421 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:07.421 "is_configured": true, 00:41:07.421 "data_offset": 256, 00:41:07.421 "data_size": 7936 00:41:07.421 } 00:41:07.421 ] 00:41:07.421 }' 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:07.421 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:41:07.699 [2024-06-10 11:51:51.486278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:07.699 [2024-06-10 11:51:51.488607] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1318590 00:41:07.699 [2024-06-10 11:51:51.489680] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:41:07.699 11:51:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:08.644 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:08.903 "name": "raid_bdev1", 00:41:08.903 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:08.903 "strip_size_kb": 0, 00:41:08.903 "state": "online", 00:41:08.903 "raid_level": "raid1", 00:41:08.903 "superblock": true, 00:41:08.903 "num_base_bdevs": 2, 
00:41:08.903 "num_base_bdevs_discovered": 2, 00:41:08.903 "num_base_bdevs_operational": 2, 00:41:08.903 "process": { 00:41:08.903 "type": "rebuild", 00:41:08.903 "target": "spare", 00:41:08.903 "progress": { 00:41:08.903 "blocks": 2816, 00:41:08.903 "percent": 35 00:41:08.903 } 00:41:08.903 }, 00:41:08.903 "base_bdevs_list": [ 00:41:08.903 { 00:41:08.903 "name": "spare", 00:41:08.903 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:08.903 "is_configured": true, 00:41:08.903 "data_offset": 256, 00:41:08.903 "data_size": 7936 00:41:08.903 }, 00:41:08.903 { 00:41:08.903 "name": "BaseBdev2", 00:41:08.903 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:08.903 "is_configured": true, 00:41:08.903 "data_offset": 256, 00:41:08.903 "data_size": 7936 00:41:08.903 } 00:41:08.903 ] 00:41:08.903 }' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:41:08.903 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=842 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:08.903 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:08.904 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:08.904 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:08.904 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:09.162 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:09.162 "name": "raid_bdev1", 00:41:09.162 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:09.162 "strip_size_kb": 0, 00:41:09.162 "state": "online", 00:41:09.162 "raid_level": "raid1", 00:41:09.162 "superblock": true, 00:41:09.162 "num_base_bdevs": 2, 00:41:09.162 "num_base_bdevs_discovered": 2, 00:41:09.162 "num_base_bdevs_operational": 2, 00:41:09.162 "process": { 00:41:09.162 "type": "rebuild", 00:41:09.162 "target": "spare", 00:41:09.162 "progress": { 00:41:09.162 "blocks": 3328, 00:41:09.162 "percent": 41 00:41:09.162 } 00:41:09.162 }, 00:41:09.162 "base_bdevs_list": [ 00:41:09.162 { 00:41:09.162 "name": "spare", 00:41:09.162 "uuid": 
"b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:09.162 "is_configured": true, 00:41:09.162 "data_offset": 256, 00:41:09.162 "data_size": 7936 00:41:09.162 }, 00:41:09.162 { 00:41:09.162 "name": "BaseBdev2", 00:41:09.162 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:09.162 "is_configured": true, 00:41:09.162 "data_offset": 256, 00:41:09.162 "data_size": 7936 00:41:09.162 } 00:41:09.162 ] 00:41:09.162 }' 00:41:09.162 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:09.162 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:09.162 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:09.162 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:09.162 11:51:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:10.099 11:51:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:10.358 11:51:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:10.358 "name": "raid_bdev1", 00:41:10.358 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:10.358 "strip_size_kb": 0, 00:41:10.358 "state": "online", 00:41:10.358 "raid_level": "raid1", 00:41:10.358 "superblock": true, 00:41:10.358 "num_base_bdevs": 2, 00:41:10.358 "num_base_bdevs_discovered": 2, 00:41:10.358 "num_base_bdevs_operational": 2, 00:41:10.358 "process": { 00:41:10.358 "type": "rebuild", 00:41:10.358 "target": "spare", 00:41:10.358 "progress": { 00:41:10.358 "blocks": 6656, 00:41:10.358 "percent": 83 00:41:10.358 } 00:41:10.358 }, 00:41:10.358 "base_bdevs_list": [ 00:41:10.358 { 00:41:10.358 "name": "spare", 00:41:10.358 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:10.358 "is_configured": true, 00:41:10.358 "data_offset": 256, 00:41:10.358 "data_size": 7936 00:41:10.358 }, 00:41:10.358 { 00:41:10.358 "name": "BaseBdev2", 00:41:10.358 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:10.358 "is_configured": true, 00:41:10.358 "data_offset": 256, 00:41:10.358 "data_size": 7936 00:41:10.358 } 00:41:10.358 ] 00:41:10.358 }' 00:41:10.358 11:51:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:10.358 11:51:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:10.358 11:51:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:10.358 11:51:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:10.358 11:51:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:41:10.926 [2024-06-10 11:51:54.612499] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:41:10.926 [2024-06-10 11:51:54.612546] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:41:10.926 [2024-06-10 11:51:54.612613] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:11.494 "name": "raid_bdev1", 00:41:11.494 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:11.494 "strip_size_kb": 0, 00:41:11.494 "state": "online", 00:41:11.494 "raid_level": "raid1", 00:41:11.494 "superblock": true, 00:41:11.494 "num_base_bdevs": 2, 00:41:11.494 "num_base_bdevs_discovered": 2, 00:41:11.494 "num_base_bdevs_operational": 2, 00:41:11.494 "base_bdevs_list": [ 00:41:11.494 { 00:41:11.494 "name": "spare", 00:41:11.494 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 
00:41:11.494 "is_configured": true, 00:41:11.494 "data_offset": 256, 00:41:11.494 "data_size": 7936 00:41:11.494 }, 00:41:11.494 { 00:41:11.494 "name": "BaseBdev2", 00:41:11.494 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:11.494 "is_configured": true, 00:41:11.494 "data_offset": 256, 00:41:11.494 "data_size": 7936 00:41:11.494 } 00:41:11.494 ] 00:41:11.494 }' 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:41:11.494 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:11.753 11:51:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:11.753 "name": "raid_bdev1", 00:41:11.753 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:11.753 "strip_size_kb": 0, 00:41:11.753 "state": "online", 00:41:11.753 "raid_level": "raid1", 00:41:11.753 "superblock": true, 00:41:11.753 "num_base_bdevs": 2, 00:41:11.753 "num_base_bdevs_discovered": 2, 00:41:11.753 "num_base_bdevs_operational": 2, 00:41:11.753 "base_bdevs_list": [ 00:41:11.753 { 00:41:11.753 "name": "spare", 00:41:11.753 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:11.753 "is_configured": true, 00:41:11.753 "data_offset": 256, 00:41:11.753 "data_size": 7936 00:41:11.753 }, 00:41:11.753 { 00:41:11.753 "name": "BaseBdev2", 00:41:11.753 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:11.753 "is_configured": true, 00:41:11.753 "data_offset": 256, 00:41:11.753 "data_size": 7936 00:41:11.753 } 00:41:11.753 ] 00:41:11.753 }' 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:11.753 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:11.754 11:51:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:11.754 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:12.013 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:12.013 "name": "raid_bdev1", 00:41:12.013 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:12.013 "strip_size_kb": 0, 00:41:12.013 "state": "online", 00:41:12.013 "raid_level": "raid1", 00:41:12.013 "superblock": true, 00:41:12.013 "num_base_bdevs": 2, 00:41:12.013 "num_base_bdevs_discovered": 2, 00:41:12.013 "num_base_bdevs_operational": 2, 00:41:12.013 "base_bdevs_list": [ 00:41:12.013 { 00:41:12.013 "name": "spare", 00:41:12.013 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:12.013 "is_configured": true, 00:41:12.013 "data_offset": 256, 00:41:12.013 "data_size": 7936 00:41:12.013 }, 00:41:12.013 { 00:41:12.013 "name": "BaseBdev2", 00:41:12.013 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:12.013 "is_configured": true, 00:41:12.013 "data_offset": 256, 00:41:12.013 "data_size": 7936 00:41:12.013 } 00:41:12.013 ] 
00:41:12.013 }' 00:41:12.013 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:12.013 11:51:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:12.581 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:41:12.581 [2024-06-10 11:51:56.452825] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:41:12.581 [2024-06-10 11:51:56.452846] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:41:12.581 [2024-06-10 11:51:56.452902] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:12.581 [2024-06-10 11:51:56.452941] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:12.581 [2024-06-10 11:51:56.452949] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1318d40 name raid_bdev1, state offline 00:41:12.581 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:12.581 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:41:12.840 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:41:12.841 11:51:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:41:12.841 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:41:13.100 /dev/nbd0 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:41:13.100 11:51:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:13.100 1+0 records in 00:41:13.100 1+0 records out 00:41:13.100 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247658 s, 16.5 MB/s 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:41:13.100 11:51:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:41:13.100 /dev/nbd1 00:41:13.100 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:41:13.360 11:51:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:13.360 1+0 records in 00:41:13.360 1+0 records out 00:41:13.360 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297452 s, 13.8 MB/s 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:41:13.360 11:51:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:13.360 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:13.620 11:51:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:41:13.620 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:41:13.879 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:41:14.139 [2024-06-10 11:51:57.849636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:41:14.139 [2024-06-10 11:51:57.849682] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:14.139 [2024-06-10 11:51:57.849697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1318fc0 00:41:14.139 [2024-06-10 11:51:57.849706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:14.139 [2024-06-10 11:51:57.850801] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:14.139 [2024-06-10 11:51:57.850826] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:41:14.139 [2024-06-10 11:51:57.850878] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:41:14.139 [2024-06-10 11:51:57.850900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:14.139 [2024-06-10 11:51:57.850971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:41:14.139 spare 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:14.139 11:51:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:14.139 [2024-06-10 11:51:57.951263] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13797b0 00:41:14.139 [2024-06-10 11:51:57.951277] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:41:14.139 [2024-06-10 11:51:57.951334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1379470 00:41:14.139 [2024-06-10 11:51:57.951430] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13797b0 00:41:14.139 [2024-06-10 11:51:57.951437] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13797b0 00:41:14.139 [2024-06-10 11:51:57.951493] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:14.139 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:14.139 "name": "raid_bdev1", 00:41:14.139 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:14.139 "strip_size_kb": 0, 00:41:14.139 "state": "online", 00:41:14.139 "raid_level": "raid1", 00:41:14.139 "superblock": true, 00:41:14.139 "num_base_bdevs": 2, 00:41:14.139 
"num_base_bdevs_discovered": 2, 00:41:14.139 "num_base_bdevs_operational": 2, 00:41:14.139 "base_bdevs_list": [ 00:41:14.139 { 00:41:14.139 "name": "spare", 00:41:14.139 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:14.139 "is_configured": true, 00:41:14.139 "data_offset": 256, 00:41:14.139 "data_size": 7936 00:41:14.139 }, 00:41:14.139 { 00:41:14.139 "name": "BaseBdev2", 00:41:14.139 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:14.139 "is_configured": true, 00:41:14.139 "data_offset": 256, 00:41:14.139 "data_size": 7936 00:41:14.139 } 00:41:14.139 ] 00:41:14.139 }' 00:41:14.139 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:14.139 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:14.712 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:14.975 "name": "raid_bdev1", 00:41:14.975 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:14.975 
"strip_size_kb": 0, 00:41:14.975 "state": "online", 00:41:14.975 "raid_level": "raid1", 00:41:14.975 "superblock": true, 00:41:14.975 "num_base_bdevs": 2, 00:41:14.975 "num_base_bdevs_discovered": 2, 00:41:14.975 "num_base_bdevs_operational": 2, 00:41:14.975 "base_bdevs_list": [ 00:41:14.975 { 00:41:14.975 "name": "spare", 00:41:14.975 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:14.975 "is_configured": true, 00:41:14.975 "data_offset": 256, 00:41:14.975 "data_size": 7936 00:41:14.975 }, 00:41:14.975 { 00:41:14.975 "name": "BaseBdev2", 00:41:14.975 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:14.975 "is_configured": true, 00:41:14.975 "data_offset": 256, 00:41:14.975 "data_size": 7936 00:41:14.975 } 00:41:14.975 ] 00:41:14.975 }' 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:14.975 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:41:15.234 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:41:15.234 11:51:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:41:15.234 [2024-06-10 11:51:59.092913] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:15.234 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:15.520 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:15.520 "name": "raid_bdev1", 00:41:15.520 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:15.520 "strip_size_kb": 0, 00:41:15.520 "state": "online", 00:41:15.520 "raid_level": "raid1", 00:41:15.520 "superblock": true, 00:41:15.520 
"num_base_bdevs": 2, 00:41:15.520 "num_base_bdevs_discovered": 1, 00:41:15.520 "num_base_bdevs_operational": 1, 00:41:15.520 "base_bdevs_list": [ 00:41:15.520 { 00:41:15.520 "name": null, 00:41:15.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:15.520 "is_configured": false, 00:41:15.520 "data_offset": 256, 00:41:15.520 "data_size": 7936 00:41:15.520 }, 00:41:15.520 { 00:41:15.521 "name": "BaseBdev2", 00:41:15.521 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:15.521 "is_configured": true, 00:41:15.521 "data_offset": 256, 00:41:15.521 "data_size": 7936 00:41:15.521 } 00:41:15.521 ] 00:41:15.521 }' 00:41:15.521 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:15.521 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:15.821 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:41:16.080 [2024-06-10 11:51:59.927076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:16.080 [2024-06-10 11:51:59.927207] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:41:16.080 [2024-06-10 11:51:59.927219] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:41:16.080 [2024-06-10 11:51:59.927242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:16.080 [2024-06-10 11:51:59.929219] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ddb20 00:41:16.080 [2024-06-10 11:51:59.930278] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:41:16.080 11:51:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:17.016 11:52:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:17.275 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:17.275 "name": "raid_bdev1", 00:41:17.275 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:17.275 "strip_size_kb": 0, 00:41:17.275 "state": "online", 00:41:17.275 "raid_level": "raid1", 00:41:17.275 "superblock": true, 00:41:17.275 "num_base_bdevs": 2, 00:41:17.275 "num_base_bdevs_discovered": 2, 00:41:17.275 "num_base_bdevs_operational": 2, 00:41:17.275 "process": { 00:41:17.275 "type": "rebuild", 00:41:17.275 
"target": "spare", 00:41:17.275 "progress": { 00:41:17.275 "blocks": 2816, 00:41:17.275 "percent": 35 00:41:17.275 } 00:41:17.275 }, 00:41:17.275 "base_bdevs_list": [ 00:41:17.275 { 00:41:17.275 "name": "spare", 00:41:17.275 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:17.275 "is_configured": true, 00:41:17.275 "data_offset": 256, 00:41:17.275 "data_size": 7936 00:41:17.275 }, 00:41:17.275 { 00:41:17.275 "name": "BaseBdev2", 00:41:17.275 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:17.275 "is_configured": true, 00:41:17.275 "data_offset": 256, 00:41:17.275 "data_size": 7936 00:41:17.275 } 00:41:17.275 ] 00:41:17.275 }' 00:41:17.275 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:17.275 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:17.275 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:17.275 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:17.275 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:41:17.535 [2024-06-10 11:52:01.375597] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:17.535 [2024-06-10 11:52:01.441432] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:41:17.535 [2024-06-10 11:52:01.441469] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:17.535 [2024-06-10 11:52:01.441479] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:17.535 [2024-06-10 11:52:01.441485] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:17.535 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:17.793 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:17.793 "name": "raid_bdev1", 00:41:17.793 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:17.793 "strip_size_kb": 0, 00:41:17.793 "state": "online", 00:41:17.793 "raid_level": "raid1", 00:41:17.793 "superblock": true, 00:41:17.793 "num_base_bdevs": 2, 00:41:17.793 "num_base_bdevs_discovered": 1, 
00:41:17.793 "num_base_bdevs_operational": 1, 00:41:17.793 "base_bdevs_list": [ 00:41:17.793 { 00:41:17.793 "name": null, 00:41:17.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:17.793 "is_configured": false, 00:41:17.793 "data_offset": 256, 00:41:17.793 "data_size": 7936 00:41:17.793 }, 00:41:17.793 { 00:41:17.793 "name": "BaseBdev2", 00:41:17.793 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:17.793 "is_configured": true, 00:41:17.793 "data_offset": 256, 00:41:17.793 "data_size": 7936 00:41:17.793 } 00:41:17.793 ] 00:41:17.793 }' 00:41:17.793 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:17.793 11:52:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:18.358 11:52:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:41:18.358 [2024-06-10 11:52:02.262505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:41:18.358 [2024-06-10 11:52:02.262552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:18.358 [2024-06-10 11:52:02.262567] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x131c560 00:41:18.358 [2024-06-10 11:52:02.262575] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:18.358 [2024-06-10 11:52:02.262743] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:18.358 [2024-06-10 11:52:02.262754] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:41:18.358 [2024-06-10 11:52:02.262796] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:41:18.358 [2024-06-10 11:52:02.262804] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:41:18.358 [2024-06-10 11:52:02.262811] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:41:18.358 [2024-06-10 11:52:02.262823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:18.358 [2024-06-10 11:52:02.264771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1379a90 00:41:18.358 [2024-06-10 11:52:02.265708] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:41:18.358 spare 00:41:18.358 11:52:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:19.735 "name": "raid_bdev1", 00:41:19.735 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:19.735 "strip_size_kb": 0, 00:41:19.735 "state": "online", 00:41:19.735 "raid_level": "raid1", 00:41:19.735 "superblock": 
true, 00:41:19.735 "num_base_bdevs": 2, 00:41:19.735 "num_base_bdevs_discovered": 2, 00:41:19.735 "num_base_bdevs_operational": 2, 00:41:19.735 "process": { 00:41:19.735 "type": "rebuild", 00:41:19.735 "target": "spare", 00:41:19.735 "progress": { 00:41:19.735 "blocks": 2816, 00:41:19.735 "percent": 35 00:41:19.735 } 00:41:19.735 }, 00:41:19.735 "base_bdevs_list": [ 00:41:19.735 { 00:41:19.735 "name": "spare", 00:41:19.735 "uuid": "b587f987-79cb-50ee-8d3d-2560568d8f1d", 00:41:19.735 "is_configured": true, 00:41:19.735 "data_offset": 256, 00:41:19.735 "data_size": 7936 00:41:19.735 }, 00:41:19.735 { 00:41:19.735 "name": "BaseBdev2", 00:41:19.735 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:19.735 "is_configured": true, 00:41:19.735 "data_offset": 256, 00:41:19.735 "data_size": 7936 00:41:19.735 } 00:41:19.735 ] 00:41:19.735 }' 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:19.735 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:41:19.995 [2024-06-10 11:52:03.707112] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:19.995 [2024-06-10 11:52:03.776653] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:41:19.995 [2024-06-10 11:52:03.776688] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:19.995 [2024-06-10 11:52:03.776698] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:19.995 [2024-06-10 11:52:03.776704] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:19.995 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:20.254 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:20.254 "name": "raid_bdev1", 00:41:20.254 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 
00:41:20.254 "strip_size_kb": 0, 00:41:20.254 "state": "online", 00:41:20.254 "raid_level": "raid1", 00:41:20.254 "superblock": true, 00:41:20.254 "num_base_bdevs": 2, 00:41:20.254 "num_base_bdevs_discovered": 1, 00:41:20.254 "num_base_bdevs_operational": 1, 00:41:20.254 "base_bdevs_list": [ 00:41:20.254 { 00:41:20.254 "name": null, 00:41:20.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:20.254 "is_configured": false, 00:41:20.254 "data_offset": 256, 00:41:20.254 "data_size": 7936 00:41:20.254 }, 00:41:20.254 { 00:41:20.254 "name": "BaseBdev2", 00:41:20.254 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:20.254 "is_configured": true, 00:41:20.254 "data_offset": 256, 00:41:20.254 "data_size": 7936 00:41:20.254 } 00:41:20.254 ] 00:41:20.254 }' 00:41:20.254 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:20.254 11:52:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:20.820 11:52:04 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:20.820 "name": "raid_bdev1", 00:41:20.820 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:20.820 "strip_size_kb": 0, 00:41:20.820 "state": "online", 00:41:20.820 "raid_level": "raid1", 00:41:20.820 "superblock": true, 00:41:20.820 "num_base_bdevs": 2, 00:41:20.820 "num_base_bdevs_discovered": 1, 00:41:20.820 "num_base_bdevs_operational": 1, 00:41:20.820 "base_bdevs_list": [ 00:41:20.820 { 00:41:20.820 "name": null, 00:41:20.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:20.820 "is_configured": false, 00:41:20.820 "data_offset": 256, 00:41:20.820 "data_size": 7936 00:41:20.820 }, 00:41:20.820 { 00:41:20.820 "name": "BaseBdev2", 00:41:20.820 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:20.820 "is_configured": true, 00:41:20.820 "data_offset": 256, 00:41:20.820 "data_size": 7936 00:41:20.820 } 00:41:20.820 ] 00:41:20.820 }' 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:20.820 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:41:21.077 11:52:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:41:21.334 [2024-06-10 11:52:05.083489] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:41:21.334 [2024-06-10 11:52:05.083539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:21.334 [2024-06-10 11:52:05.083556] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11dec00 00:41:21.334 [2024-06-10 11:52:05.083565] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:21.334 [2024-06-10 11:52:05.083730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:21.334 [2024-06-10 11:52:05.083743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:41:21.334 [2024-06-10 11:52:05.083779] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:41:21.334 [2024-06-10 11:52:05.083796] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:41:21.334 [2024-06-10 11:52:05.083804] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:41:21.334 BaseBdev1 00:41:21.334 11:52:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:22.267 
11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:22.267 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:22.525 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:22.525 "name": "raid_bdev1", 00:41:22.525 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:22.525 "strip_size_kb": 0, 00:41:22.525 "state": "online", 00:41:22.525 "raid_level": "raid1", 00:41:22.525 "superblock": true, 00:41:22.525 "num_base_bdevs": 2, 00:41:22.525 "num_base_bdevs_discovered": 1, 00:41:22.525 "num_base_bdevs_operational": 1, 00:41:22.525 "base_bdevs_list": [ 00:41:22.525 { 00:41:22.525 "name": null, 00:41:22.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:22.525 "is_configured": false, 00:41:22.525 "data_offset": 256, 00:41:22.525 "data_size": 7936 00:41:22.525 }, 00:41:22.525 { 00:41:22.525 "name": "BaseBdev2", 00:41:22.525 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:22.525 "is_configured": true, 00:41:22.525 "data_offset": 256, 00:41:22.525 "data_size": 7936 00:41:22.525 } 00:41:22.525 ] 00:41:22.525 }' 00:41:22.525 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:22.525 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:23.088 11:52:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:23.088 "name": "raid_bdev1", 00:41:23.088 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:23.088 "strip_size_kb": 0, 00:41:23.088 "state": "online", 00:41:23.088 "raid_level": "raid1", 00:41:23.088 "superblock": true, 00:41:23.088 "num_base_bdevs": 2, 00:41:23.088 "num_base_bdevs_discovered": 1, 00:41:23.088 "num_base_bdevs_operational": 1, 00:41:23.088 "base_bdevs_list": [ 00:41:23.088 { 00:41:23.088 "name": null, 00:41:23.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:23.088 "is_configured": false, 00:41:23.088 "data_offset": 256, 00:41:23.088 "data_size": 7936 00:41:23.088 }, 00:41:23.088 { 00:41:23.088 "name": "BaseBdev2", 00:41:23.088 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:23.088 "is_configured": true, 00:41:23.088 "data_offset": 256, 00:41:23.088 "data_size": 7936 00:41:23.088 } 00:41:23.088 ] 00:41:23.088 }' 00:41:23.088 11:52:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:23.088 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:23.088 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 
00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:41:23.345 [2024-06-10 11:52:07.205000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:41:23.345 [2024-06-10 11:52:07.205129] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:41:23.345 [2024-06-10 11:52:07.205141] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:41:23.345 request: 00:41:23.345 { 00:41:23.345 "raid_bdev": "raid_bdev1", 00:41:23.345 "base_bdev": "BaseBdev1", 00:41:23.345 "method": "bdev_raid_add_base_bdev", 00:41:23.345 "req_id": 1 00:41:23.345 } 00:41:23.345 Got JSON-RPC error response 00:41:23.345 response: 00:41:23.345 { 00:41:23.345 "code": -22, 00:41:23.345 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:41:23.345 } 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # es=1 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:23.345 11:52:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:24.714 "name": "raid_bdev1", 00:41:24.714 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:24.714 "strip_size_kb": 0, 00:41:24.714 "state": "online", 00:41:24.714 "raid_level": "raid1", 00:41:24.714 "superblock": true, 00:41:24.714 "num_base_bdevs": 2, 00:41:24.714 "num_base_bdevs_discovered": 1, 
00:41:24.714 "num_base_bdevs_operational": 1, 00:41:24.714 "base_bdevs_list": [ 00:41:24.714 { 00:41:24.714 "name": null, 00:41:24.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:24.714 "is_configured": false, 00:41:24.714 "data_offset": 256, 00:41:24.714 "data_size": 7936 00:41:24.714 }, 00:41:24.714 { 00:41:24.714 "name": "BaseBdev2", 00:41:24.714 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:24.714 "is_configured": true, 00:41:24.714 "data_offset": 256, 00:41:24.714 "data_size": 7936 00:41:24.714 } 00:41:24.714 ] 00:41:24.714 }' 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:24.714 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:24.971 11:52:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:25.228 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:25.228 "name": "raid_bdev1", 00:41:25.228 "uuid": "b62618f8-0f9c-4b3b-a6aa-604c9aee4e88", 00:41:25.228 "strip_size_kb": 0, 00:41:25.228 
"state": "online", 00:41:25.228 "raid_level": "raid1", 00:41:25.228 "superblock": true, 00:41:25.228 "num_base_bdevs": 2, 00:41:25.229 "num_base_bdevs_discovered": 1, 00:41:25.229 "num_base_bdevs_operational": 1, 00:41:25.229 "base_bdevs_list": [ 00:41:25.229 { 00:41:25.229 "name": null, 00:41:25.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:25.229 "is_configured": false, 00:41:25.229 "data_offset": 256, 00:41:25.229 "data_size": 7936 00:41:25.229 }, 00:41:25.229 { 00:41:25.229 "name": "BaseBdev2", 00:41:25.229 "uuid": "14b9885e-03b2-5b48-bcfa-6f62b2a63fe8", 00:41:25.229 "is_configured": true, 00:41:25.229 "data_offset": 256, 00:41:25.229 "data_size": 7936 00:41:25.229 } 00:41:25.229 ] 00:41:25.229 }' 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 254188 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 254188 ']' 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 254188 00:41:25.229 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:41:25.486 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:41:25.486 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 254188 00:41:25.486 11:52:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:41:25.486 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:41:25.486 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 254188' 00:41:25.486 killing process with pid 254188 00:41:25.486 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 254188 00:41:25.486 Received shutdown signal, test time was about 60.000000 seconds 00:41:25.486 00:41:25.486 Latency(us) 00:41:25.486 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:25.486 =================================================================================================================== 00:41:25.486 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:41:25.486 [2024-06-10 11:52:09.219383] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:41:25.486 [2024-06-10 11:52:09.219453] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:25.486 [2024-06-10 11:52:09.219483] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:25.486 [2024-06-10 11:52:09.219491] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13797b0 name raid_bdev1, state offline 00:41:25.486 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 254188 00:41:25.486 [2024-06-10 11:52:09.256364] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:41:25.744 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:41:25.744 00:41:25.744 real 0m25.748s 00:41:25.744 user 0m38.737s 00:41:25.744 sys 0m3.991s 00:41:25.744 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:41:25.744 11:52:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:41:25.744 ************************************ 00:41:25.744 END TEST raid_rebuild_test_sb_md_separate 00:41:25.744 ************************************ 00:41:25.744 11:52:09 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:41:25.744 11:52:09 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:41:25.744 11:52:09 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:41:25.744 11:52:09 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:25.744 11:52:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:41:25.744 ************************************ 00:41:25.744 START TEST raid_state_function_test_sb_md_interleaved 00:41:25.744 ************************************ 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:41:25.744 11:52:09 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # 
raid_pid=258000 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 258000' 00:41:25.744 Process raid pid: 258000 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 258000 /var/tmp/spdk-raid.sock 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 258000 ']' 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:41:25.744 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:41:25.744 11:52:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:25.744 [2024-06-10 11:52:09.606783] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:41:25.744 [2024-06-10 11:52:09.606833] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:41:26.007 [2024-06-10 11:52:09.694561] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:26.007 [2024-06-10 11:52:09.774185] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:26.007 [2024-06-10 11:52:09.829602] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:26.007 [2024-06-10 11:52:09.829632] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:26.570 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:26.570 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:41:26.570 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:41:26.828 [2024-06-10 11:52:10.559265] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:41:26.828 [2024-06-10 11:52:10.559302] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:41:26.828 [2024-06-10 11:52:10.559310] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:41:26.828 [2024-06-10 11:52:10.559318] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=Existed_Raid 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:26.828 "name": "Existed_Raid", 00:41:26.828 "uuid": "3c3e1c43-6a8c-408f-a3b0-bdb2277c388d", 00:41:26.828 "strip_size_kb": 0, 00:41:26.828 "state": "configuring", 00:41:26.828 "raid_level": "raid1", 00:41:26.828 "superblock": true, 00:41:26.828 "num_base_bdevs": 2, 00:41:26.828 "num_base_bdevs_discovered": 0, 00:41:26.828 "num_base_bdevs_operational": 2, 00:41:26.828 "base_bdevs_list": [ 00:41:26.828 { 00:41:26.828 "name": 
"BaseBdev1", 00:41:26.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:26.828 "is_configured": false, 00:41:26.828 "data_offset": 0, 00:41:26.828 "data_size": 0 00:41:26.828 }, 00:41:26.828 { 00:41:26.828 "name": "BaseBdev2", 00:41:26.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:26.828 "is_configured": false, 00:41:26.828 "data_offset": 0, 00:41:26.828 "data_size": 0 00:41:26.828 } 00:41:26.828 ] 00:41:26.828 }' 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:26.828 11:52:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:27.393 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:41:27.651 [2024-06-10 11:52:11.401351] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:41:27.651 [2024-06-10 11:52:11.401377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ec8510 name Existed_Raid, state configuring 00:41:27.651 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:41:27.651 [2024-06-10 11:52:11.573820] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:41:27.651 [2024-06-10 11:52:11.573854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:41:27.651 [2024-06-10 11:52:11.573861] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:41:27.651 [2024-06-10 11:52:11.573874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:41:27.651 11:52:11 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:41:27.908 [2024-06-10 11:52:11.751008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:41:27.908 BaseBdev1 00:41:27.908 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:41:27.908 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:41:27.909 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:41:27.909 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:41:27.909 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:41:27.909 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:41:27.909 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:41:28.167 11:52:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:41:28.167 [ 00:41:28.167 { 00:41:28.167 "name": "BaseBdev1", 00:41:28.167 "aliases": [ 00:41:28.167 "43fe7b35-37a2-4d58-b182-a361e220d074" 00:41:28.167 ], 00:41:28.167 "product_name": "Malloc disk", 00:41:28.167 "block_size": 4128, 00:41:28.167 "num_blocks": 8192, 00:41:28.167 "uuid": "43fe7b35-37a2-4d58-b182-a361e220d074", 00:41:28.167 "md_size": 32, 00:41:28.167 "md_interleave": 
true, 00:41:28.167 "dif_type": 0, 00:41:28.167 "assigned_rate_limits": { 00:41:28.167 "rw_ios_per_sec": 0, 00:41:28.167 "rw_mbytes_per_sec": 0, 00:41:28.167 "r_mbytes_per_sec": 0, 00:41:28.167 "w_mbytes_per_sec": 0 00:41:28.167 }, 00:41:28.167 "claimed": true, 00:41:28.167 "claim_type": "exclusive_write", 00:41:28.167 "zoned": false, 00:41:28.167 "supported_io_types": { 00:41:28.167 "read": true, 00:41:28.167 "write": true, 00:41:28.167 "unmap": true, 00:41:28.167 "write_zeroes": true, 00:41:28.167 "flush": true, 00:41:28.167 "reset": true, 00:41:28.167 "compare": false, 00:41:28.167 "compare_and_write": false, 00:41:28.167 "abort": true, 00:41:28.167 "nvme_admin": false, 00:41:28.167 "nvme_io": false 00:41:28.167 }, 00:41:28.167 "memory_domains": [ 00:41:28.167 { 00:41:28.167 "dma_device_id": "system", 00:41:28.167 "dma_device_type": 1 00:41:28.167 }, 00:41:28.167 { 00:41:28.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:28.167 "dma_device_type": 2 00:41:28.167 } 00:41:28.167 ], 00:41:28.167 "driver_specific": {} 00:41:28.167 } 00:41:28.167 ] 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:41:28.424 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:28.425 "name": "Existed_Raid", 00:41:28.425 "uuid": "2cf9fad0-52fd-482b-918e-71cf8bf1cdf0", 00:41:28.425 "strip_size_kb": 0, 00:41:28.425 "state": "configuring", 00:41:28.425 "raid_level": "raid1", 00:41:28.425 "superblock": true, 00:41:28.425 "num_base_bdevs": 2, 00:41:28.425 "num_base_bdevs_discovered": 1, 00:41:28.425 "num_base_bdevs_operational": 2, 00:41:28.425 "base_bdevs_list": [ 00:41:28.425 { 00:41:28.425 "name": "BaseBdev1", 00:41:28.425 "uuid": "43fe7b35-37a2-4d58-b182-a361e220d074", 00:41:28.425 "is_configured": true, 00:41:28.425 "data_offset": 256, 00:41:28.425 "data_size": 7936 00:41:28.425 }, 00:41:28.425 { 00:41:28.425 "name": "BaseBdev2", 00:41:28.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:28.425 "is_configured": false, 00:41:28.425 "data_offset": 0, 00:41:28.425 "data_size": 0 00:41:28.425 } 00:41:28.425 ] 00:41:28.425 }' 00:41:28.425 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:41:28.425 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:28.990 11:52:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:41:29.249 [2024-06-10 11:52:12.986223] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:41:29.249 [2024-06-10 11:52:12.986261] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ec7e00 name Existed_Raid, state configuring 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:41:29.249 [2024-06-10 11:52:13.146664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:41:29.249 [2024-06-10 11:52:13.147766] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:41:29.249 [2024-06-10 11:52:13.147795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:41:29.249 11:52:13 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:29.249 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:41:29.507 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:29.507 "name": "Existed_Raid", 00:41:29.507 "uuid": "2df76409-7251-4718-921a-30b6f66312e4", 00:41:29.507 "strip_size_kb": 0, 00:41:29.507 "state": "configuring", 00:41:29.507 "raid_level": "raid1", 00:41:29.507 "superblock": true, 00:41:29.507 "num_base_bdevs": 2, 00:41:29.507 "num_base_bdevs_discovered": 1, 00:41:29.507 "num_base_bdevs_operational": 2, 00:41:29.507 "base_bdevs_list": [ 00:41:29.507 { 00:41:29.507 "name": "BaseBdev1", 00:41:29.507 "uuid": "43fe7b35-37a2-4d58-b182-a361e220d074", 00:41:29.507 "is_configured": true, 00:41:29.507 "data_offset": 256, 00:41:29.507 "data_size": 7936 00:41:29.507 }, 
00:41:29.507 { 00:41:29.507 "name": "BaseBdev2", 00:41:29.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:29.507 "is_configured": false, 00:41:29.507 "data_offset": 0, 00:41:29.507 "data_size": 0 00:41:29.507 } 00:41:29.507 ] 00:41:29.507 }' 00:41:29.507 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:29.507 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:30.071 11:52:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:41:30.071 [2024-06-10 11:52:13.995842] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:41:30.071 [2024-06-10 11:52:13.995957] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f49cd0 00:41:30.071 [2024-06-10 11:52:13.995967] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:30.071 [2024-06-10 11:52:13.996010] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebfc00 00:41:30.071 [2024-06-10 11:52:13.996064] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f49cd0 00:41:30.071 [2024-06-10 11:52:13.996070] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f49cd0 00:41:30.071 [2024-06-10 11:52:13.996108] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:30.071 BaseBdev2 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:41:30.071 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:41:30.330 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:41:30.587 [ 00:41:30.587 { 00:41:30.587 "name": "BaseBdev2", 00:41:30.587 "aliases": [ 00:41:30.588 "c47cf4b5-9dfc-4855-8e73-9865a0cab568" 00:41:30.588 ], 00:41:30.588 "product_name": "Malloc disk", 00:41:30.588 "block_size": 4128, 00:41:30.588 "num_blocks": 8192, 00:41:30.588 "uuid": "c47cf4b5-9dfc-4855-8e73-9865a0cab568", 00:41:30.588 "md_size": 32, 00:41:30.588 "md_interleave": true, 00:41:30.588 "dif_type": 0, 00:41:30.588 "assigned_rate_limits": { 00:41:30.588 "rw_ios_per_sec": 0, 00:41:30.588 "rw_mbytes_per_sec": 0, 00:41:30.588 "r_mbytes_per_sec": 0, 00:41:30.588 "w_mbytes_per_sec": 0 00:41:30.588 }, 00:41:30.588 "claimed": true, 00:41:30.588 "claim_type": "exclusive_write", 00:41:30.588 "zoned": false, 00:41:30.588 "supported_io_types": { 00:41:30.588 "read": true, 00:41:30.588 "write": true, 00:41:30.588 "unmap": true, 00:41:30.588 "write_zeroes": true, 00:41:30.588 "flush": true, 00:41:30.588 "reset": true, 00:41:30.588 "compare": false, 00:41:30.588 "compare_and_write": false, 00:41:30.588 "abort": true, 00:41:30.588 "nvme_admin": false, 00:41:30.588 "nvme_io": false 
00:41:30.588 }, 00:41:30.588 "memory_domains": [ 00:41:30.588 { 00:41:30.588 "dma_device_id": "system", 00:41:30.588 "dma_device_type": 1 00:41:30.588 }, 00:41:30.588 { 00:41:30.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:30.588 "dma_device_type": 2 00:41:30.588 } 00:41:30.588 ], 00:41:30.588 "driver_specific": {} 00:41:30.588 } 00:41:30.588 ] 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:30.588 11:52:14 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:30.588 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:41:30.859 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:30.859 "name": "Existed_Raid", 00:41:30.859 "uuid": "2df76409-7251-4718-921a-30b6f66312e4", 00:41:30.859 "strip_size_kb": 0, 00:41:30.859 "state": "online", 00:41:30.859 "raid_level": "raid1", 00:41:30.859 "superblock": true, 00:41:30.859 "num_base_bdevs": 2, 00:41:30.859 "num_base_bdevs_discovered": 2, 00:41:30.859 "num_base_bdevs_operational": 2, 00:41:30.859 "base_bdevs_list": [ 00:41:30.859 { 00:41:30.859 "name": "BaseBdev1", 00:41:30.860 "uuid": "43fe7b35-37a2-4d58-b182-a361e220d074", 00:41:30.860 "is_configured": true, 00:41:30.860 "data_offset": 256, 00:41:30.860 "data_size": 7936 00:41:30.860 }, 00:41:30.860 { 00:41:30.860 "name": "BaseBdev2", 00:41:30.860 "uuid": "c47cf4b5-9dfc-4855-8e73-9865a0cab568", 00:41:30.860 "is_configured": true, 00:41:30.860 "data_offset": 256, 00:41:30.860 "data_size": 7936 00:41:30.860 } 00:41:30.860 ] 00:41:30.860 }' 00:41:30.860 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:30.860 11:52:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:41:31.118 
11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:41:31.118 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:41:31.377 [2024-06-10 11:52:15.179088] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:31.377 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:41:31.377 "name": "Existed_Raid", 00:41:31.377 "aliases": [ 00:41:31.377 "2df76409-7251-4718-921a-30b6f66312e4" 00:41:31.377 ], 00:41:31.377 "product_name": "Raid Volume", 00:41:31.377 "block_size": 4128, 00:41:31.377 "num_blocks": 7936, 00:41:31.377 "uuid": "2df76409-7251-4718-921a-30b6f66312e4", 00:41:31.377 "md_size": 32, 00:41:31.377 "md_interleave": true, 00:41:31.377 "dif_type": 0, 00:41:31.377 "assigned_rate_limits": { 00:41:31.377 "rw_ios_per_sec": 0, 00:41:31.377 "rw_mbytes_per_sec": 0, 00:41:31.377 "r_mbytes_per_sec": 0, 00:41:31.377 "w_mbytes_per_sec": 0 00:41:31.377 }, 00:41:31.377 "claimed": false, 00:41:31.377 "zoned": false, 00:41:31.377 "supported_io_types": { 00:41:31.377 "read": true, 00:41:31.377 "write": true, 00:41:31.377 "unmap": false, 00:41:31.377 "write_zeroes": true, 00:41:31.377 "flush": false, 00:41:31.377 "reset": true, 00:41:31.377 "compare": false, 00:41:31.377 "compare_and_write": 
false, 00:41:31.377 "abort": false, 00:41:31.377 "nvme_admin": false, 00:41:31.377 "nvme_io": false 00:41:31.377 }, 00:41:31.377 "memory_domains": [ 00:41:31.377 { 00:41:31.377 "dma_device_id": "system", 00:41:31.377 "dma_device_type": 1 00:41:31.377 }, 00:41:31.377 { 00:41:31.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:31.377 "dma_device_type": 2 00:41:31.377 }, 00:41:31.377 { 00:41:31.377 "dma_device_id": "system", 00:41:31.377 "dma_device_type": 1 00:41:31.377 }, 00:41:31.377 { 00:41:31.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:31.377 "dma_device_type": 2 00:41:31.377 } 00:41:31.377 ], 00:41:31.377 "driver_specific": { 00:41:31.377 "raid": { 00:41:31.377 "uuid": "2df76409-7251-4718-921a-30b6f66312e4", 00:41:31.377 "strip_size_kb": 0, 00:41:31.377 "state": "online", 00:41:31.377 "raid_level": "raid1", 00:41:31.377 "superblock": true, 00:41:31.377 "num_base_bdevs": 2, 00:41:31.377 "num_base_bdevs_discovered": 2, 00:41:31.377 "num_base_bdevs_operational": 2, 00:41:31.377 "base_bdevs_list": [ 00:41:31.377 { 00:41:31.377 "name": "BaseBdev1", 00:41:31.377 "uuid": "43fe7b35-37a2-4d58-b182-a361e220d074", 00:41:31.377 "is_configured": true, 00:41:31.377 "data_offset": 256, 00:41:31.377 "data_size": 7936 00:41:31.377 }, 00:41:31.377 { 00:41:31.377 "name": "BaseBdev2", 00:41:31.377 "uuid": "c47cf4b5-9dfc-4855-8e73-9865a0cab568", 00:41:31.377 "is_configured": true, 00:41:31.377 "data_offset": 256, 00:41:31.377 "data_size": 7936 00:41:31.377 } 00:41:31.377 ] 00:41:31.377 } 00:41:31.377 } 00:41:31.377 }' 00:41:31.377 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:41:31.377 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:41:31.377 BaseBdev2' 00:41:31.377 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:41:31.377 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:41:31.377 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:41:31.636 "name": "BaseBdev1", 00:41:31.636 "aliases": [ 00:41:31.636 "43fe7b35-37a2-4d58-b182-a361e220d074" 00:41:31.636 ], 00:41:31.636 "product_name": "Malloc disk", 00:41:31.636 "block_size": 4128, 00:41:31.636 "num_blocks": 8192, 00:41:31.636 "uuid": "43fe7b35-37a2-4d58-b182-a361e220d074", 00:41:31.636 "md_size": 32, 00:41:31.636 "md_interleave": true, 00:41:31.636 "dif_type": 0, 00:41:31.636 "assigned_rate_limits": { 00:41:31.636 "rw_ios_per_sec": 0, 00:41:31.636 "rw_mbytes_per_sec": 0, 00:41:31.636 "r_mbytes_per_sec": 0, 00:41:31.636 "w_mbytes_per_sec": 0 00:41:31.636 }, 00:41:31.636 "claimed": true, 00:41:31.636 "claim_type": "exclusive_write", 00:41:31.636 "zoned": false, 00:41:31.636 "supported_io_types": { 00:41:31.636 "read": true, 00:41:31.636 "write": true, 00:41:31.636 "unmap": true, 00:41:31.636 "write_zeroes": true, 00:41:31.636 "flush": true, 00:41:31.636 "reset": true, 00:41:31.636 "compare": false, 00:41:31.636 "compare_and_write": false, 00:41:31.636 "abort": true, 00:41:31.636 "nvme_admin": false, 00:41:31.636 "nvme_io": false 00:41:31.636 }, 00:41:31.636 "memory_domains": [ 00:41:31.636 { 00:41:31.636 "dma_device_id": "system", 00:41:31.636 "dma_device_type": 1 00:41:31.636 }, 00:41:31.636 { 00:41:31.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:31.636 "dma_device_type": 2 00:41:31.636 } 00:41:31.636 ], 00:41:31.636 "driver_specific": {} 00:41:31.636 }' 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:31.636 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:41:31.895 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:41:32.153 "name": "BaseBdev2", 00:41:32.153 "aliases": [ 
00:41:32.153 "c47cf4b5-9dfc-4855-8e73-9865a0cab568" 00:41:32.153 ], 00:41:32.153 "product_name": "Malloc disk", 00:41:32.153 "block_size": 4128, 00:41:32.153 "num_blocks": 8192, 00:41:32.153 "uuid": "c47cf4b5-9dfc-4855-8e73-9865a0cab568", 00:41:32.153 "md_size": 32, 00:41:32.153 "md_interleave": true, 00:41:32.153 "dif_type": 0, 00:41:32.153 "assigned_rate_limits": { 00:41:32.153 "rw_ios_per_sec": 0, 00:41:32.153 "rw_mbytes_per_sec": 0, 00:41:32.153 "r_mbytes_per_sec": 0, 00:41:32.153 "w_mbytes_per_sec": 0 00:41:32.153 }, 00:41:32.153 "claimed": true, 00:41:32.153 "claim_type": "exclusive_write", 00:41:32.153 "zoned": false, 00:41:32.153 "supported_io_types": { 00:41:32.153 "read": true, 00:41:32.153 "write": true, 00:41:32.153 "unmap": true, 00:41:32.153 "write_zeroes": true, 00:41:32.153 "flush": true, 00:41:32.153 "reset": true, 00:41:32.153 "compare": false, 00:41:32.153 "compare_and_write": false, 00:41:32.153 "abort": true, 00:41:32.153 "nvme_admin": false, 00:41:32.153 "nvme_io": false 00:41:32.153 }, 00:41:32.153 "memory_domains": [ 00:41:32.153 { 00:41:32.153 "dma_device_id": "system", 00:41:32.153 "dma_device_type": 1 00:41:32.153 }, 00:41:32.153 { 00:41:32.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:32.153 "dma_device_type": 2 00:41:32.153 } 00:41:32.153 ], 00:41:32.153 "driver_specific": {} 00:41:32.153 }' 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:32.153 11:52:15 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:41:32.153 11:52:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:32.153 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:32.153 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:41:32.153 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:32.153 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:41:32.412 [2024-06-10 11:52:16.297857] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:41:32.412 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:32.671 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:32.671 "name": "Existed_Raid", 00:41:32.671 "uuid": "2df76409-7251-4718-921a-30b6f66312e4", 00:41:32.671 "strip_size_kb": 0, 00:41:32.671 "state": "online", 00:41:32.671 "raid_level": "raid1", 00:41:32.671 "superblock": true, 00:41:32.671 "num_base_bdevs": 2, 00:41:32.671 "num_base_bdevs_discovered": 1, 00:41:32.671 "num_base_bdevs_operational": 1, 00:41:32.671 "base_bdevs_list": [ 00:41:32.671 { 
00:41:32.671 "name": null, 00:41:32.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:32.671 "is_configured": false, 00:41:32.671 "data_offset": 256, 00:41:32.671 "data_size": 7936 00:41:32.671 }, 00:41:32.671 { 00:41:32.671 "name": "BaseBdev2", 00:41:32.671 "uuid": "c47cf4b5-9dfc-4855-8e73-9865a0cab568", 00:41:32.671 "is_configured": true, 00:41:32.671 "data_offset": 256, 00:41:32.671 "data_size": 7936 00:41:32.671 } 00:41:32.671 ] 00:41:32.671 }' 00:41:32.671 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:32.671 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:33.238 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:41:33.238 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:41:33.238 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:33.238 11:52:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:41:33.238 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:41:33.238 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:41:33.238 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:41:33.497 [2024-06-10 11:52:17.282138] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:41:33.497 [2024-06-10 11:52:17.282204] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev 
state changing from online to offline 00:41:33.497 [2024-06-10 11:52:17.293197] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:33.497 [2024-06-10 11:52:17.293228] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:33.497 [2024-06-10 11:52:17.293236] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f49cd0 name Existed_Raid, state offline 00:41:33.497 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:41:33.497 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:41:33.497 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:33.497 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 258000 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 258000 ']' 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 258000 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 258000 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 258000' 00:41:33.756 killing process with pid 258000 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 258000 00:41:33.756 [2024-06-10 11:52:17.528351] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:41:33.756 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 258000 00:41:33.756 [2024-06-10 11:52:17.529136] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:41:34.015 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:41:34.015 00:41:34.015 real 0m8.156s 00:41:34.015 user 0m14.272s 00:41:34.015 sys 0m1.658s 00:41:34.015 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:34.015 11:52:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:34.015 ************************************ 00:41:34.015 END TEST raid_state_function_test_sb_md_interleaved 00:41:34.015 ************************************ 00:41:34.015 11:52:17 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:41:34.015 11:52:17 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:41:34.015 11:52:17 
bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:34.015 11:52:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:41:34.015 ************************************ 00:41:34.015 START TEST raid_superblock_test_md_interleaved 00:41:34.015 ************************************ 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 
00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=259303 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 259303 /var/tmp/spdk-raid.sock 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:41:34.015 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 259303 ']' 00:41:34.016 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:41:34.016 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:41:34.016 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:41:34.016 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:41:34.016 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:41:34.016 11:52:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:34.016 [2024-06-10 11:52:17.844030] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:41:34.016 [2024-06-10 11:52:17.844083] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid259303 ] 00:41:34.016 [2024-06-10 11:52:17.930121] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:34.274 [2024-06-10 11:52:18.016118] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:34.274 [2024-06-10 11:52:18.076702] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:34.274 [2024-06-10 11:52:18.076741] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:41:34.843 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:41:34.843 11:52:18 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:41:34.843 malloc1 00:41:35.102 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:41:35.103 [2024-06-10 11:52:18.946029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:41:35.103 [2024-06-10 11:52:18.946068] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:35.103 [2024-06-10 11:52:18.946084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22670a0 00:41:35.103 [2024-06-10 11:52:18.946093] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:35.103 [2024-06-10 11:52:18.947331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:35.103 [2024-06-10 11:52:18.947357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:41:35.103 pt1 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:41:35.103 11:52:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:41:35.361 malloc2 00:41:35.361 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:41:35.361 [2024-06-10 11:52:19.303151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:41:35.361 [2024-06-10 11:52:19.303189] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:35.361 [2024-06-10 11:52:19.303201] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c2f90 00:41:35.361 [2024-06-10 11:52:19.303214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:35.362 [2024-06-10 11:52:19.304483] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:35.362 [2024-06-10 11:52:19.304506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:41:35.362 pt2 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 
00:41:35.621 [2024-06-10 11:52:19.463577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:41:35.621 [2024-06-10 11:52:19.464744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:41:35.621 [2024-06-10 11:52:19.464852] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c3930 00:41:35.621 [2024-06-10 11:52:19.464861] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:35.621 [2024-06-10 11:52:19.464917] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c5760 00:41:35.621 [2024-06-10 11:52:19.464974] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c3930 00:41:35.621 [2024-06-10 11:52:19.464980] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c3930 00:41:35.621 [2024-06-10 11:52:19.465019] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:35.621 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:35.881 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:35.881 "name": "raid_bdev1", 00:41:35.881 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:35.881 "strip_size_kb": 0, 00:41:35.881 "state": "online", 00:41:35.881 "raid_level": "raid1", 00:41:35.881 "superblock": true, 00:41:35.881 "num_base_bdevs": 2, 00:41:35.881 "num_base_bdevs_discovered": 2, 00:41:35.881 "num_base_bdevs_operational": 2, 00:41:35.881 "base_bdevs_list": [ 00:41:35.881 { 00:41:35.881 "name": "pt1", 00:41:35.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:35.881 "is_configured": true, 00:41:35.881 "data_offset": 256, 00:41:35.881 "data_size": 7936 00:41:35.881 }, 00:41:35.881 { 00:41:35.881 "name": "pt2", 00:41:35.881 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:35.881 "is_configured": true, 00:41:35.881 "data_offset": 256, 00:41:35.881 "data_size": 7936 00:41:35.881 } 00:41:35.881 ] 00:41:35.881 }' 00:41:35.881 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:35.881 11:52:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:41:36.448 11:52:20 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:41:36.448 [2024-06-10 11:52:20.313925] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:36.448 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:41:36.448 "name": "raid_bdev1", 00:41:36.448 "aliases": [ 00:41:36.448 "3b8c6bb3-8365-4871-b539-16c9cb2ce669" 00:41:36.448 ], 00:41:36.448 "product_name": "Raid Volume", 00:41:36.448 "block_size": 4128, 00:41:36.448 "num_blocks": 7936, 00:41:36.448 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:36.448 "md_size": 32, 00:41:36.448 "md_interleave": true, 00:41:36.448 "dif_type": 0, 00:41:36.448 "assigned_rate_limits": { 00:41:36.448 "rw_ios_per_sec": 0, 00:41:36.448 "rw_mbytes_per_sec": 0, 00:41:36.448 "r_mbytes_per_sec": 0, 00:41:36.448 "w_mbytes_per_sec": 0 00:41:36.448 }, 00:41:36.448 "claimed": false, 00:41:36.448 "zoned": false, 00:41:36.448 "supported_io_types": { 00:41:36.448 "read": true, 00:41:36.448 "write": true, 00:41:36.448 "unmap": false, 00:41:36.448 "write_zeroes": true, 00:41:36.448 "flush": false, 00:41:36.448 "reset": true, 
00:41:36.448 "compare": false, 00:41:36.448 "compare_and_write": false, 00:41:36.448 "abort": false, 00:41:36.448 "nvme_admin": false, 00:41:36.448 "nvme_io": false 00:41:36.448 }, 00:41:36.448 "memory_domains": [ 00:41:36.448 { 00:41:36.448 "dma_device_id": "system", 00:41:36.448 "dma_device_type": 1 00:41:36.448 }, 00:41:36.448 { 00:41:36.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:36.448 "dma_device_type": 2 00:41:36.448 }, 00:41:36.448 { 00:41:36.448 "dma_device_id": "system", 00:41:36.448 "dma_device_type": 1 00:41:36.448 }, 00:41:36.448 { 00:41:36.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:36.448 "dma_device_type": 2 00:41:36.448 } 00:41:36.448 ], 00:41:36.448 "driver_specific": { 00:41:36.448 "raid": { 00:41:36.448 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:36.448 "strip_size_kb": 0, 00:41:36.448 "state": "online", 00:41:36.448 "raid_level": "raid1", 00:41:36.448 "superblock": true, 00:41:36.448 "num_base_bdevs": 2, 00:41:36.448 "num_base_bdevs_discovered": 2, 00:41:36.448 "num_base_bdevs_operational": 2, 00:41:36.448 "base_bdevs_list": [ 00:41:36.448 { 00:41:36.448 "name": "pt1", 00:41:36.448 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:36.448 "is_configured": true, 00:41:36.448 "data_offset": 256, 00:41:36.448 "data_size": 7936 00:41:36.448 }, 00:41:36.448 { 00:41:36.448 "name": "pt2", 00:41:36.448 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:36.448 "is_configured": true, 00:41:36.448 "data_offset": 256, 00:41:36.448 "data_size": 7936 00:41:36.448 } 00:41:36.448 ] 00:41:36.448 } 00:41:36.449 } 00:41:36.449 }' 00:41:36.449 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:41:36.449 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:41:36.449 pt2' 00:41:36.449 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:41:36.449 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:41:36.449 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:41:36.710 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:41:36.710 "name": "pt1", 00:41:36.710 "aliases": [ 00:41:36.710 "00000000-0000-0000-0000-000000000001" 00:41:36.710 ], 00:41:36.710 "product_name": "passthru", 00:41:36.710 "block_size": 4128, 00:41:36.710 "num_blocks": 8192, 00:41:36.710 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:36.710 "md_size": 32, 00:41:36.710 "md_interleave": true, 00:41:36.710 "dif_type": 0, 00:41:36.710 "assigned_rate_limits": { 00:41:36.710 "rw_ios_per_sec": 0, 00:41:36.710 "rw_mbytes_per_sec": 0, 00:41:36.710 "r_mbytes_per_sec": 0, 00:41:36.710 "w_mbytes_per_sec": 0 00:41:36.710 }, 00:41:36.710 "claimed": true, 00:41:36.710 "claim_type": "exclusive_write", 00:41:36.710 "zoned": false, 00:41:36.710 "supported_io_types": { 00:41:36.710 "read": true, 00:41:36.710 "write": true, 00:41:36.710 "unmap": true, 00:41:36.710 "write_zeroes": true, 00:41:36.710 "flush": true, 00:41:36.710 "reset": true, 00:41:36.710 "compare": false, 00:41:36.710 "compare_and_write": false, 00:41:36.710 "abort": true, 00:41:36.710 "nvme_admin": false, 00:41:36.710 "nvme_io": false 00:41:36.710 }, 00:41:36.710 "memory_domains": [ 00:41:36.710 { 00:41:36.710 "dma_device_id": "system", 00:41:36.710 "dma_device_type": 1 00:41:36.710 }, 00:41:36.710 { 00:41:36.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:36.710 "dma_device_type": 2 00:41:36.710 } 00:41:36.710 ], 00:41:36.710 "driver_specific": { 00:41:36.710 "passthru": { 00:41:36.710 "name": "pt1", 00:41:36.710 "base_bdev_name": "malloc1" 00:41:36.710 } 00:41:36.710 } 
00:41:36.710 }' 00:41:36.710 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:36.710 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:36.710 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:41:36.710 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:41:37.036 11:52:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:41:37.303 "name": "pt2", 00:41:37.303 "aliases": [ 
00:41:37.303 "00000000-0000-0000-0000-000000000002" 00:41:37.303 ], 00:41:37.303 "product_name": "passthru", 00:41:37.303 "block_size": 4128, 00:41:37.303 "num_blocks": 8192, 00:41:37.303 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:37.303 "md_size": 32, 00:41:37.303 "md_interleave": true, 00:41:37.303 "dif_type": 0, 00:41:37.303 "assigned_rate_limits": { 00:41:37.303 "rw_ios_per_sec": 0, 00:41:37.303 "rw_mbytes_per_sec": 0, 00:41:37.303 "r_mbytes_per_sec": 0, 00:41:37.303 "w_mbytes_per_sec": 0 00:41:37.303 }, 00:41:37.303 "claimed": true, 00:41:37.303 "claim_type": "exclusive_write", 00:41:37.303 "zoned": false, 00:41:37.303 "supported_io_types": { 00:41:37.303 "read": true, 00:41:37.303 "write": true, 00:41:37.303 "unmap": true, 00:41:37.303 "write_zeroes": true, 00:41:37.303 "flush": true, 00:41:37.303 "reset": true, 00:41:37.303 "compare": false, 00:41:37.303 "compare_and_write": false, 00:41:37.303 "abort": true, 00:41:37.303 "nvme_admin": false, 00:41:37.303 "nvme_io": false 00:41:37.303 }, 00:41:37.303 "memory_domains": [ 00:41:37.303 { 00:41:37.303 "dma_device_id": "system", 00:41:37.303 "dma_device_type": 1 00:41:37.303 }, 00:41:37.303 { 00:41:37.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:37.303 "dma_device_type": 2 00:41:37.303 } 00:41:37.303 ], 00:41:37.303 "driver_specific": { 00:41:37.303 "passthru": { 00:41:37.303 "name": "pt2", 00:41:37.303 "base_bdev_name": "malloc2" 00:41:37.303 } 00:41:37.303 } 00:41:37.303 }' 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:41:37.303 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:37.562 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:41:37.821 [2024-06-10 11:52:21.509047] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:37.821 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=3b8c6bb3-8365-4871-b539-16c9cb2ce669 00:41:37.821 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 3b8c6bb3-8365-4871-b539-16c9cb2ce669 ']' 00:41:37.821 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:41:37.821 [2024-06-10 11:52:21.689348] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:41:37.821 [2024-06-10 11:52:21.689363] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:41:37.821 [2024-06-10 11:52:21.689402] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:37.821 [2024-06-10 11:52:21.689436] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:37.821 [2024-06-10 11:52:21.689444] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c3930 name raid_bdev1, state offline 00:41:37.821 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:37.821 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:41:38.080 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:41:38.080 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:41:38.080 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:41:38.080 11:52:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:41:38.339 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:41:38.339 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:41:38.339 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:41:38.339 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # 
jq -r '[.[] | select(.product_name == "passthru")] | any' 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:41:38.600 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:41:38.859 [2024-06-10 11:52:22.559574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:41:38.859 [2024-06-10 11:52:22.560616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:41:38.859 [2024-06-10 11:52:22.560659] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:41:38.859 [2024-06-10 11:52:22.560690] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:41:38.859 [2024-06-10 11:52:22.560704] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:41:38.859 [2024-06-10 11:52:22.560711] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225d560 name raid_bdev1, state configuring 00:41:38.859 request: 00:41:38.859 { 00:41:38.859 "name": "raid_bdev1", 00:41:38.859 "raid_level": "raid1", 00:41:38.859 "base_bdevs": [ 00:41:38.859 "malloc1", 00:41:38.859 "malloc2" 00:41:38.859 ], 00:41:38.859 "superblock": false, 00:41:38.859 "method": "bdev_raid_create", 00:41:38.859 "req_id": 1 00:41:38.859 } 00:41:38.859 Got JSON-RPC error response 00:41:38.859 response: 00:41:38.859 { 00:41:38.859 "code": -17, 00:41:38.859 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:41:38.859 } 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:38.859 11:52:22 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:41:38.859 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:41:39.118 [2024-06-10 11:52:22.904430] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:41:39.118 [2024-06-10 11:52:22.904467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:39.118 [2024-06-10 11:52:22.904479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c19f0 00:41:39.118 [2024-06-10 11:52:22.904491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:39.118 [2024-06-10 11:52:22.905539] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:39.118 [2024-06-10 11:52:22.905561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:41:39.118 [2024-06-10 11:52:22.905595] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:41:39.118 [2024-06-10 11:52:22.905615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:41:39.118 pt1 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:39.118 11:52:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:39.377 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:39.377 "name": "raid_bdev1", 00:41:39.377 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:39.377 "strip_size_kb": 0, 00:41:39.377 "state": "configuring", 00:41:39.377 "raid_level": "raid1", 00:41:39.377 "superblock": true, 00:41:39.377 
"num_base_bdevs": 2, 00:41:39.377 "num_base_bdevs_discovered": 1, 00:41:39.377 "num_base_bdevs_operational": 2, 00:41:39.377 "base_bdevs_list": [ 00:41:39.377 { 00:41:39.377 "name": "pt1", 00:41:39.377 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:39.377 "is_configured": true, 00:41:39.377 "data_offset": 256, 00:41:39.377 "data_size": 7936 00:41:39.377 }, 00:41:39.377 { 00:41:39.377 "name": null, 00:41:39.377 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:39.377 "is_configured": false, 00:41:39.377 "data_offset": 256, 00:41:39.377 "data_size": 7936 00:41:39.377 } 00:41:39.377 ] 00:41:39.377 }' 00:41:39.377 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:39.377 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:39.944 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:41:39.944 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:41:39.945 [2024-06-10 11:52:23.734577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:41:39.945 [2024-06-10 11:52:23.734619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:39.945 [2024-06-10 11:52:23.734633] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c2450 00:41:39.945 [2024-06-10 11:52:23.734641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:39.945 [2024-06-10 11:52:23.734758] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:39.945 [2024-06-10 11:52:23.734774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:41:39.945 [2024-06-10 11:52:23.734805] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:41:39.945 [2024-06-10 11:52:23.734818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:41:39.945 [2024-06-10 11:52:23.734885] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c5ea0 00:41:39.945 [2024-06-10 11:52:23.734893] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:39.945 [2024-06-10 11:52:23.734928] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c7340 00:41:39.945 [2024-06-10 11:52:23.734979] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c5ea0 00:41:39.945 [2024-06-10 11:52:23.734986] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c5ea0 00:41:39.945 [2024-06-10 11:52:23.735023] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:39.945 pt2 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:39.945 11:52:23 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:39.945 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:40.204 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:40.204 "name": "raid_bdev1", 00:41:40.204 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:40.204 "strip_size_kb": 0, 00:41:40.204 "state": "online", 00:41:40.204 "raid_level": "raid1", 00:41:40.204 "superblock": true, 00:41:40.204 "num_base_bdevs": 2, 00:41:40.204 "num_base_bdevs_discovered": 2, 00:41:40.204 "num_base_bdevs_operational": 2, 00:41:40.204 "base_bdevs_list": [ 00:41:40.204 { 00:41:40.204 "name": "pt1", 00:41:40.204 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:40.204 "is_configured": true, 00:41:40.204 "data_offset": 256, 00:41:40.204 "data_size": 7936 00:41:40.204 }, 00:41:40.204 { 00:41:40.204 "name": "pt2", 00:41:40.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:40.204 "is_configured": true, 00:41:40.204 "data_offset": 256, 00:41:40.204 "data_size": 7936 00:41:40.204 
} 00:41:40.204 ] 00:41:40.204 }' 00:41:40.204 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:40.204 11:52:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:41:40.772 [2024-06-10 11:52:24.604983] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:41:40.772 "name": "raid_bdev1", 00:41:40.772 "aliases": [ 00:41:40.772 "3b8c6bb3-8365-4871-b539-16c9cb2ce669" 00:41:40.772 ], 00:41:40.772 "product_name": "Raid Volume", 00:41:40.772 "block_size": 4128, 00:41:40.772 "num_blocks": 7936, 00:41:40.772 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:40.772 "md_size": 32, 00:41:40.772 "md_interleave": true, 00:41:40.772 "dif_type": 0, 00:41:40.772 "assigned_rate_limits": { 00:41:40.772 
"rw_ios_per_sec": 0, 00:41:40.772 "rw_mbytes_per_sec": 0, 00:41:40.772 "r_mbytes_per_sec": 0, 00:41:40.772 "w_mbytes_per_sec": 0 00:41:40.772 }, 00:41:40.772 "claimed": false, 00:41:40.772 "zoned": false, 00:41:40.772 "supported_io_types": { 00:41:40.772 "read": true, 00:41:40.772 "write": true, 00:41:40.772 "unmap": false, 00:41:40.772 "write_zeroes": true, 00:41:40.772 "flush": false, 00:41:40.772 "reset": true, 00:41:40.772 "compare": false, 00:41:40.772 "compare_and_write": false, 00:41:40.772 "abort": false, 00:41:40.772 "nvme_admin": false, 00:41:40.772 "nvme_io": false 00:41:40.772 }, 00:41:40.772 "memory_domains": [ 00:41:40.772 { 00:41:40.772 "dma_device_id": "system", 00:41:40.772 "dma_device_type": 1 00:41:40.772 }, 00:41:40.772 { 00:41:40.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:40.772 "dma_device_type": 2 00:41:40.772 }, 00:41:40.772 { 00:41:40.772 "dma_device_id": "system", 00:41:40.772 "dma_device_type": 1 00:41:40.772 }, 00:41:40.772 { 00:41:40.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:40.772 "dma_device_type": 2 00:41:40.772 } 00:41:40.772 ], 00:41:40.772 "driver_specific": { 00:41:40.772 "raid": { 00:41:40.772 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:40.772 "strip_size_kb": 0, 00:41:40.772 "state": "online", 00:41:40.772 "raid_level": "raid1", 00:41:40.772 "superblock": true, 00:41:40.772 "num_base_bdevs": 2, 00:41:40.772 "num_base_bdevs_discovered": 2, 00:41:40.772 "num_base_bdevs_operational": 2, 00:41:40.772 "base_bdevs_list": [ 00:41:40.772 { 00:41:40.772 "name": "pt1", 00:41:40.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:40.772 "is_configured": true, 00:41:40.772 "data_offset": 256, 00:41:40.772 "data_size": 7936 00:41:40.772 }, 00:41:40.772 { 00:41:40.772 "name": "pt2", 00:41:40.772 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:40.772 "is_configured": true, 00:41:40.772 "data_offset": 256, 00:41:40.772 "data_size": 7936 00:41:40.772 } 00:41:40.772 ] 00:41:40.772 } 00:41:40.772 } 
00:41:40.772 }' 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:41:40.772 pt2' 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:41:40.772 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:41:41.031 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:41:41.031 "name": "pt1", 00:41:41.031 "aliases": [ 00:41:41.031 "00000000-0000-0000-0000-000000000001" 00:41:41.031 ], 00:41:41.031 "product_name": "passthru", 00:41:41.031 "block_size": 4128, 00:41:41.031 "num_blocks": 8192, 00:41:41.031 "uuid": "00000000-0000-0000-0000-000000000001", 00:41:41.031 "md_size": 32, 00:41:41.031 "md_interleave": true, 00:41:41.031 "dif_type": 0, 00:41:41.031 "assigned_rate_limits": { 00:41:41.031 "rw_ios_per_sec": 0, 00:41:41.031 "rw_mbytes_per_sec": 0, 00:41:41.031 "r_mbytes_per_sec": 0, 00:41:41.031 "w_mbytes_per_sec": 0 00:41:41.031 }, 00:41:41.031 "claimed": true, 00:41:41.031 "claim_type": "exclusive_write", 00:41:41.031 "zoned": false, 00:41:41.031 "supported_io_types": { 00:41:41.031 "read": true, 00:41:41.031 "write": true, 00:41:41.031 "unmap": true, 00:41:41.031 "write_zeroes": true, 00:41:41.031 "flush": true, 00:41:41.031 "reset": true, 00:41:41.031 "compare": false, 00:41:41.031 "compare_and_write": false, 00:41:41.031 "abort": true, 00:41:41.031 "nvme_admin": false, 00:41:41.031 "nvme_io": false 00:41:41.031 }, 00:41:41.031 
"memory_domains": [ 00:41:41.031 { 00:41:41.031 "dma_device_id": "system", 00:41:41.031 "dma_device_type": 1 00:41:41.031 }, 00:41:41.031 { 00:41:41.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:41.031 "dma_device_type": 2 00:41:41.031 } 00:41:41.031 ], 00:41:41.031 "driver_specific": { 00:41:41.031 "passthru": { 00:41:41.031 "name": "pt1", 00:41:41.031 "base_bdev_name": "malloc1" 00:41:41.031 } 00:41:41.031 } 00:41:41.031 }' 00:41:41.031 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:41.031 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:41.031 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:41:41.031 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:41.290 11:52:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:41:41.290 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:41:41.550 "name": "pt2", 00:41:41.550 "aliases": [ 00:41:41.550 "00000000-0000-0000-0000-000000000002" 00:41:41.550 ], 00:41:41.550 "product_name": "passthru", 00:41:41.550 "block_size": 4128, 00:41:41.550 "num_blocks": 8192, 00:41:41.550 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:41.550 "md_size": 32, 00:41:41.550 "md_interleave": true, 00:41:41.550 "dif_type": 0, 00:41:41.550 "assigned_rate_limits": { 00:41:41.550 "rw_ios_per_sec": 0, 00:41:41.550 "rw_mbytes_per_sec": 0, 00:41:41.550 "r_mbytes_per_sec": 0, 00:41:41.550 "w_mbytes_per_sec": 0 00:41:41.550 }, 00:41:41.550 "claimed": true, 00:41:41.550 "claim_type": "exclusive_write", 00:41:41.550 "zoned": false, 00:41:41.550 "supported_io_types": { 00:41:41.550 "read": true, 00:41:41.550 "write": true, 00:41:41.550 "unmap": true, 00:41:41.550 "write_zeroes": true, 00:41:41.550 "flush": true, 00:41:41.550 "reset": true, 00:41:41.550 "compare": false, 00:41:41.550 "compare_and_write": false, 00:41:41.550 "abort": true, 00:41:41.550 "nvme_admin": false, 00:41:41.550 "nvme_io": false 00:41:41.550 }, 00:41:41.550 "memory_domains": [ 00:41:41.550 { 00:41:41.550 "dma_device_id": "system", 00:41:41.550 "dma_device_type": 1 00:41:41.550 }, 00:41:41.550 { 00:41:41.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:41:41.550 "dma_device_type": 2 00:41:41.550 } 00:41:41.550 ], 00:41:41.550 "driver_specific": { 00:41:41.550 "passthru": { 00:41:41.550 "name": "pt2", 00:41:41.550 "base_bdev_name": "malloc2" 00:41:41.550 } 00:41:41.550 } 00:41:41.550 }' 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:41:41.550 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:41:41.809 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:42.068 [2024-06-10 11:52:25.788023] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 3b8c6bb3-8365-4871-b539-16c9cb2ce669 '!=' 3b8c6bb3-8365-4871-b539-16c9cb2ce669 ']' 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy 
raid1 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:41:42.068 [2024-06-10 11:52:25.964393] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:41:42.068 11:52:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:42.327 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:42.327 "name": "raid_bdev1", 00:41:42.327 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:42.327 "strip_size_kb": 0, 00:41:42.327 "state": "online", 00:41:42.327 "raid_level": "raid1", 00:41:42.327 "superblock": true, 00:41:42.327 "num_base_bdevs": 2, 00:41:42.327 "num_base_bdevs_discovered": 1, 00:41:42.327 "num_base_bdevs_operational": 1, 00:41:42.327 "base_bdevs_list": [ 00:41:42.327 { 00:41:42.327 "name": null, 00:41:42.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:42.327 "is_configured": false, 00:41:42.327 "data_offset": 256, 00:41:42.327 "data_size": 7936 00:41:42.327 }, 00:41:42.327 { 00:41:42.327 "name": "pt2", 00:41:42.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:42.327 "is_configured": true, 00:41:42.327 "data_offset": 256, 00:41:42.327 "data_size": 7936 00:41:42.327 } 00:41:42.327 ] 00:41:42.327 }' 00:41:42.327 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:42.327 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:42.894 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:41:42.895 [2024-06-10 11:52:26.798576] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:41:42.895 [2024-06-10 11:52:26.798599] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:41:42.895 [2024-06-10 11:52:26.798640] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:42.895 [2024-06-10 11:52:26.798673] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:42.895 [2024-06-10 11:52:26.798681] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c5ea0 name raid_bdev1, state offline 00:41:42.895 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:42.895 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:41:43.153 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:41:43.153 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:41:43.153 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:41:43.153 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:41:43.154 11:52:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:41:43.412 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:41:43.412 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:41:43.412 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:41:43.412 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:41:43.412 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:41:43.413 [2024-06-10 11:52:27.315914] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:41:43.413 [2024-06-10 11:52:27.315964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:43.413 [2024-06-10 11:52:27.315976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c1ec0 00:41:43.413 [2024-06-10 11:52:27.315984] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:43.413 [2024-06-10 11:52:27.317035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:43.413 [2024-06-10 11:52:27.317059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:41:43.413 [2024-06-10 11:52:27.317094] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:41:43.413 [2024-06-10 11:52:27.317114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:41:43.413 [2024-06-10 11:52:27.317164] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c6120 00:41:43.413 [2024-06-10 11:52:27.317170] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:43.413 [2024-06-10 11:52:27.317208] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c64e0 00:41:43.413 [2024-06-10 11:52:27.317257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c6120 00:41:43.413 [2024-06-10 11:52:27.317263] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c6120 00:41:43.413 [2024-06-10 11:52:27.317297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:43.413 pt2 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:43.413 11:52:27 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:43.413 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:43.672 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:43.672 "name": "raid_bdev1", 00:41:43.672 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:43.672 "strip_size_kb": 0, 00:41:43.672 "state": "online", 00:41:43.672 "raid_level": "raid1", 00:41:43.672 "superblock": true, 00:41:43.672 "num_base_bdevs": 2, 00:41:43.672 "num_base_bdevs_discovered": 1, 00:41:43.672 "num_base_bdevs_operational": 1, 00:41:43.672 "base_bdevs_list": [ 00:41:43.672 { 00:41:43.672 "name": null, 00:41:43.672 
"uuid": "00000000-0000-0000-0000-000000000000", 00:41:43.672 "is_configured": false, 00:41:43.672 "data_offset": 256, 00:41:43.672 "data_size": 7936 00:41:43.672 }, 00:41:43.672 { 00:41:43.672 "name": "pt2", 00:41:43.672 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:43.672 "is_configured": true, 00:41:43.672 "data_offset": 256, 00:41:43.672 "data_size": 7936 00:41:43.672 } 00:41:43.672 ] 00:41:43.672 }' 00:41:43.672 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:43.672 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:44.241 11:52:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:41:44.241 [2024-06-10 11:52:28.138037] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:41:44.241 [2024-06-10 11:52:28.138055] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:41:44.241 [2024-06-10 11:52:28.138090] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:44.241 [2024-06-10 11:52:28.138120] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:44.241 [2024-06-10 11:52:28.138128] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c6120 name raid_bdev1, state offline 00:41:44.241 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:44.241 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:41:44.500 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:41:44.500 11:52:28 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:41:44.500 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:41:44.500 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:41:44.759 [2024-06-10 11:52:28.490940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:41:44.759 [2024-06-10 11:52:28.490975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:44.759 [2024-06-10 11:52:28.490987] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c36c0 00:41:44.759 [2024-06-10 11:52:28.490995] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:44.759 [2024-06-10 11:52:28.492049] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:44.759 [2024-06-10 11:52:28.492077] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:41:44.759 [2024-06-10 11:52:28.492112] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:41:44.759 [2024-06-10 11:52:28.492132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:41:44.759 [2024-06-10 11:52:28.492191] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:41:44.759 [2024-06-10 11:52:28.492200] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:41:44.759 [2024-06-10 11:52:28.492210] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c9d80 name raid_bdev1, state configuring 00:41:44.759 [2024-06-10 11:52:28.492226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:41:44.759 [2024-06-10 11:52:28.492262] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c66a0 00:41:44.759 [2024-06-10 11:52:28.492269] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:44.759 [2024-06-10 11:52:28.492308] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c8040 00:41:44.759 [2024-06-10 11:52:28.492357] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c66a0 00:41:44.759 [2024-06-10 11:52:28.492363] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c66a0 00:41:44.759 [2024-06-10 11:52:28.492403] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:44.759 pt1 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:44.759 "name": "raid_bdev1", 00:41:44.759 "uuid": "3b8c6bb3-8365-4871-b539-16c9cb2ce669", 00:41:44.759 "strip_size_kb": 0, 00:41:44.759 "state": "online", 00:41:44.759 "raid_level": "raid1", 00:41:44.759 "superblock": true, 00:41:44.759 "num_base_bdevs": 2, 00:41:44.759 "num_base_bdevs_discovered": 1, 00:41:44.759 "num_base_bdevs_operational": 1, 00:41:44.759 "base_bdevs_list": [ 00:41:44.759 { 00:41:44.759 "name": null, 00:41:44.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:44.759 "is_configured": false, 00:41:44.759 "data_offset": 256, 00:41:44.759 "data_size": 7936 00:41:44.759 }, 00:41:44.759 { 00:41:44.759 "name": "pt2", 00:41:44.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:41:44.759 "is_configured": true, 00:41:44.759 "data_offset": 256, 00:41:44.759 "data_size": 7936 00:41:44.759 } 00:41:44.759 ] 00:41:44.759 }' 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:44.759 11:52:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:45.326 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:41:45.326 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:41:45.585 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:41:45.585 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:45.585 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:41:45.585 [2024-06-10 11:52:29.517717] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 3b8c6bb3-8365-4871-b539-16c9cb2ce669 '!=' 3b8c6bb3-8365-4871-b539-16c9cb2ce669 ']' 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 259303 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 259303 ']' 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 259303 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 259303 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 259303' 00:41:45.843 
killing process with pid 259303 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # kill 259303 00:41:45.843 [2024-06-10 11:52:29.587838] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:41:45.843 [2024-06-10 11:52:29.587882] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:45.843 [2024-06-10 11:52:29.587914] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:45.843 [2024-06-10 11:52:29.587922] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c66a0 name raid_bdev1, state offline 00:41:45.843 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@973 -- # wait 259303 00:41:45.843 [2024-06-10 11:52:29.605268] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:41:46.103 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:41:46.103 00:41:46.103 real 0m12.017s 00:41:46.103 user 0m21.574s 00:41:46.103 sys 0m2.414s 00:41:46.103 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:46.103 11:52:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:46.103 ************************************ 00:41:46.103 END TEST raid_superblock_test_md_interleaved 00:41:46.103 ************************************ 00:41:46.103 11:52:29 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:41:46.103 11:52:29 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:41:46.103 11:52:29 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:46.103 11:52:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:41:46.103 ************************************ 00:41:46.103 START TEST raid_rebuild_test_sb_md_interleaved 
00:41:46.103 ************************************ 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false false 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:41:46.103 11:52:29 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=261127 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 261127 /var/tmp/spdk-raid.sock 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 261127 ']' 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:41:46.103 11:52:29 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:41:46.103 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:41:46.103 11:52:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:46.103 [2024-06-10 11:52:29.929456] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:41:46.103 [2024-06-10 11:52:29.929502] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid261127 ] 00:41:46.103 I/O size of 3145728 is greater than zero copy threshold (65536). 00:41:46.103 Zero copy mechanism will not be used. 
00:41:46.103 [2024-06-10 11:52:30.017262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:46.362 [2024-06-10 11:52:30.105764] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:46.362 [2024-06-10 11:52:30.163273] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:46.362 [2024-06-10 11:52:30.163295] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:41:46.928 11:52:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:46.928 11:52:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:41:46.928 11:52:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:41:46.928 11:52:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:41:47.187 BaseBdev1_malloc 00:41:47.187 11:52:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:41:47.187 [2024-06-10 11:52:31.061968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:41:47.187 [2024-06-10 11:52:31.062011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:47.187 [2024-06-10 11:52:31.062029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x282f760 00:41:47.187 [2024-06-10 11:52:31.062038] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:47.187 [2024-06-10 11:52:31.063164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:47.187 [2024-06-10 11:52:31.063188] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:41:47.187 BaseBdev1 00:41:47.187 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:41:47.187 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:41:47.445 BaseBdev2_malloc 00:41:47.445 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:41:47.704 [2024-06-10 11:52:31.404245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:41:47.704 [2024-06-10 11:52:31.404284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:47.704 [2024-06-10 11:52:31.404300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2814560 00:41:47.704 [2024-06-10 11:52:31.404310] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:47.704 [2024-06-10 11:52:31.405372] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:47.704 [2024-06-10 11:52:31.405394] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:41:47.704 BaseBdev2 00:41:47.704 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:41:47.704 spare_malloc 00:41:47.704 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:41:47.962 spare_delay 00:41:47.963 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:41:47.963 [2024-06-10 11:52:31.906876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:41:47.963 [2024-06-10 11:52:31.906912] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:47.963 [2024-06-10 11:52:31.906935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2815030 00:41:47.963 [2024-06-10 11:52:31.906944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:47.963 [2024-06-10 11:52:31.907960] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:47.963 [2024-06-10 11:52:31.907981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:41:48.221 spare 00:41:48.221 11:52:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:41:48.221 [2024-06-10 11:52:32.067310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:41:48.221 [2024-06-10 11:52:32.068247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:41:48.221 [2024-06-10 11:52:32.068374] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x281f5e0 00:41:48.221 [2024-06-10 11:52:32.068382] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:48.221 [2024-06-10 11:52:32.068433] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x268a390 00:41:48.221 [2024-06-10 11:52:32.068487] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x281f5e0 00:41:48.221 [2024-06-10 11:52:32.068494] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x281f5e0 00:41:48.221 [2024-06-10 11:52:32.068533] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:48.221 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:48.222 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:48.222 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:48.222 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:48.222 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:48.222 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:48.480 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:41:48.480 "name": "raid_bdev1", 00:41:48.480 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:48.480 "strip_size_kb": 0, 00:41:48.480 "state": "online", 00:41:48.480 "raid_level": "raid1", 00:41:48.480 "superblock": true, 00:41:48.480 "num_base_bdevs": 2, 00:41:48.480 "num_base_bdevs_discovered": 2, 00:41:48.480 "num_base_bdevs_operational": 2, 00:41:48.480 "base_bdevs_list": [ 00:41:48.480 { 00:41:48.480 "name": "BaseBdev1", 00:41:48.480 "uuid": "87c1bf61-246d-58f2-9181-146df12fbf09", 00:41:48.480 "is_configured": true, 00:41:48.480 "data_offset": 256, 00:41:48.480 "data_size": 7936 00:41:48.480 }, 00:41:48.480 { 00:41:48.480 "name": "BaseBdev2", 00:41:48.480 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:48.480 "is_configured": true, 00:41:48.480 "data_offset": 256, 00:41:48.480 "data_size": 7936 00:41:48.480 } 00:41:48.480 ] 00:41:48.480 }' 00:41:48.480 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:48.480 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:49.047 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:41:49.047 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:41:49.047 [2024-06-10 11:52:32.917646] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:41:49.047 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:41:49.047 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:49.047 11:52:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:41:49.306 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:41:49.306 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:41:49.306 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:41:49.306 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:41:49.565 [2024-06-10 11:52:33.274414] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:49.565 
11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:49.565 "name": "raid_bdev1", 00:41:49.565 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:49.565 "strip_size_kb": 0, 00:41:49.565 "state": "online", 00:41:49.565 "raid_level": "raid1", 00:41:49.565 "superblock": true, 00:41:49.565 "num_base_bdevs": 2, 00:41:49.565 "num_base_bdevs_discovered": 1, 00:41:49.565 "num_base_bdevs_operational": 1, 00:41:49.565 "base_bdevs_list": [ 00:41:49.565 { 00:41:49.565 "name": null, 00:41:49.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:49.565 "is_configured": false, 00:41:49.565 "data_offset": 256, 00:41:49.565 "data_size": 7936 00:41:49.565 }, 00:41:49.565 { 00:41:49.565 "name": "BaseBdev2", 00:41:49.565 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:49.565 "is_configured": true, 00:41:49.565 "data_offset": 256, 00:41:49.565 "data_size": 7936 00:41:49.565 } 00:41:49.565 ] 00:41:49.565 }' 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:49.565 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:50.132 11:52:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:41:50.392 [2024-06-10 11:52:34.096550] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:50.392 [2024-06-10 11:52:34.099814] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x2820de0 00:41:50.392 [2024-06-10 11:52:34.101397] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:41:50.392 11:52:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:51.327 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:51.585 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:51.585 "name": "raid_bdev1", 00:41:51.585 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:51.585 "strip_size_kb": 0, 00:41:51.585 "state": "online", 00:41:51.585 "raid_level": "raid1", 00:41:51.585 "superblock": true, 00:41:51.585 "num_base_bdevs": 2, 00:41:51.585 "num_base_bdevs_discovered": 2, 00:41:51.585 "num_base_bdevs_operational": 2, 00:41:51.585 "process": { 00:41:51.585 "type": "rebuild", 00:41:51.585 "target": "spare", 00:41:51.585 "progress": { 00:41:51.585 "blocks": 2816, 00:41:51.585 "percent": 35 00:41:51.585 } 00:41:51.585 }, 00:41:51.585 "base_bdevs_list": [ 00:41:51.585 { 
00:41:51.585 "name": "spare", 00:41:51.585 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:51.585 "is_configured": true, 00:41:51.585 "data_offset": 256, 00:41:51.585 "data_size": 7936 00:41:51.585 }, 00:41:51.585 { 00:41:51.585 "name": "BaseBdev2", 00:41:51.585 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:51.586 "is_configured": true, 00:41:51.586 "data_offset": 256, 00:41:51.586 "data_size": 7936 00:41:51.586 } 00:41:51.586 ] 00:41:51.586 }' 00:41:51.586 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:51.586 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:51.586 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:51.586 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:51.586 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:41:51.844 [2024-06-10 11:52:35.550131] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:51.844 [2024-06-10 11:52:35.612447] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:41:51.844 [2024-06-10 11:52:35.612479] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:51.844 [2024-06-10 11:52:35.612489] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:51.844 [2024-06-10 11:52:35.612495] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:51.844 11:52:35 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:51.844 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:52.103 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:52.103 "name": "raid_bdev1", 00:41:52.103 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:52.103 "strip_size_kb": 0, 00:41:52.103 "state": "online", 00:41:52.103 "raid_level": "raid1", 00:41:52.103 "superblock": true, 00:41:52.103 "num_base_bdevs": 2, 00:41:52.103 "num_base_bdevs_discovered": 1, 00:41:52.103 "num_base_bdevs_operational": 1, 00:41:52.103 "base_bdevs_list": [ 00:41:52.103 { 00:41:52.103 "name": null, 00:41:52.103 
"uuid": "00000000-0000-0000-0000-000000000000", 00:41:52.103 "is_configured": false, 00:41:52.103 "data_offset": 256, 00:41:52.103 "data_size": 7936 00:41:52.103 }, 00:41:52.104 { 00:41:52.104 "name": "BaseBdev2", 00:41:52.104 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:52.104 "is_configured": true, 00:41:52.104 "data_offset": 256, 00:41:52.104 "data_size": 7936 00:41:52.104 } 00:41:52.104 ] 00:41:52.104 }' 00:41:52.104 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:52.104 11:52:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:52.363 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:52.622 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:52.622 "name": "raid_bdev1", 00:41:52.622 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:52.622 "strip_size_kb": 0, 00:41:52.622 "state": "online", 00:41:52.622 "raid_level": "raid1", 00:41:52.622 "superblock": true, 00:41:52.622 
"num_base_bdevs": 2, 00:41:52.622 "num_base_bdevs_discovered": 1, 00:41:52.622 "num_base_bdevs_operational": 1, 00:41:52.622 "base_bdevs_list": [ 00:41:52.622 { 00:41:52.622 "name": null, 00:41:52.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:52.622 "is_configured": false, 00:41:52.622 "data_offset": 256, 00:41:52.622 "data_size": 7936 00:41:52.622 }, 00:41:52.622 { 00:41:52.622 "name": "BaseBdev2", 00:41:52.622 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:52.622 "is_configured": true, 00:41:52.622 "data_offset": 256, 00:41:52.622 "data_size": 7936 00:41:52.622 } 00:41:52.622 ] 00:41:52.622 }' 00:41:52.622 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:52.622 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:52.622 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:52.622 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:52.622 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:41:52.881 [2024-06-10 11:52:36.694815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:52.881 [2024-06-10 11:52:36.698528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2821fd0 00:41:52.881 [2024-06-10 11:52:36.699598] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:41:52.881 11:52:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:53.819 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:54.078 "name": "raid_bdev1", 00:41:54.078 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:54.078 "strip_size_kb": 0, 00:41:54.078 "state": "online", 00:41:54.078 "raid_level": "raid1", 00:41:54.078 "superblock": true, 00:41:54.078 "num_base_bdevs": 2, 00:41:54.078 "num_base_bdevs_discovered": 2, 00:41:54.078 "num_base_bdevs_operational": 2, 00:41:54.078 "process": { 00:41:54.078 "type": "rebuild", 00:41:54.078 "target": "spare", 00:41:54.078 "progress": { 00:41:54.078 "blocks": 2816, 00:41:54.078 "percent": 35 00:41:54.078 } 00:41:54.078 }, 00:41:54.078 "base_bdevs_list": [ 00:41:54.078 { 00:41:54.078 "name": "spare", 00:41:54.078 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:54.078 "is_configured": true, 00:41:54.078 "data_offset": 256, 00:41:54.078 "data_size": 7936 00:41:54.078 }, 00:41:54.078 { 00:41:54.078 "name": "BaseBdev2", 00:41:54.078 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:54.078 "is_configured": true, 00:41:54.078 "data_offset": 256, 00:41:54.078 "data_size": 7936 00:41:54.078 
} 00:41:54.078 ] 00:41:54.078 }' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:41:54.078 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=887 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:54.078 11:52:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:54.337 11:52:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:54.337 "name": "raid_bdev1", 00:41:54.337 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:54.337 "strip_size_kb": 0, 00:41:54.337 "state": "online", 00:41:54.337 "raid_level": "raid1", 00:41:54.337 "superblock": true, 00:41:54.337 "num_base_bdevs": 2, 00:41:54.337 "num_base_bdevs_discovered": 2, 00:41:54.337 "num_base_bdevs_operational": 2, 00:41:54.337 "process": { 00:41:54.337 "type": "rebuild", 00:41:54.337 "target": "spare", 00:41:54.337 "progress": { 00:41:54.337 "blocks": 3584, 00:41:54.337 "percent": 45 00:41:54.337 } 00:41:54.337 }, 00:41:54.337 "base_bdevs_list": [ 00:41:54.337 { 00:41:54.337 "name": "spare", 00:41:54.337 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:54.337 "is_configured": true, 00:41:54.337 "data_offset": 256, 00:41:54.337 "data_size": 7936 00:41:54.337 }, 00:41:54.337 { 00:41:54.337 "name": "BaseBdev2", 00:41:54.337 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:54.337 "is_configured": true, 00:41:54.337 "data_offset": 256, 00:41:54.337 "data_size": 7936 00:41:54.337 } 00:41:54.337 ] 00:41:54.337 }' 00:41:54.337 11:52:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:54.337 11:52:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:54.337 11:52:38 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:54.337 11:52:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:54.337 11:52:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:55.714 "name": "raid_bdev1", 00:41:55.714 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:55.714 "strip_size_kb": 0, 00:41:55.714 "state": "online", 00:41:55.714 "raid_level": "raid1", 00:41:55.714 "superblock": true, 00:41:55.714 "num_base_bdevs": 2, 00:41:55.714 "num_base_bdevs_discovered": 2, 00:41:55.714 "num_base_bdevs_operational": 2, 00:41:55.714 "process": { 00:41:55.714 "type": "rebuild", 00:41:55.714 
"target": "spare", 00:41:55.714 "progress": { 00:41:55.714 "blocks": 6656, 00:41:55.714 "percent": 83 00:41:55.714 } 00:41:55.714 }, 00:41:55.714 "base_bdevs_list": [ 00:41:55.714 { 00:41:55.714 "name": "spare", 00:41:55.714 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:55.714 "is_configured": true, 00:41:55.714 "data_offset": 256, 00:41:55.714 "data_size": 7936 00:41:55.714 }, 00:41:55.714 { 00:41:55.714 "name": "BaseBdev2", 00:41:55.714 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:55.714 "is_configured": true, 00:41:55.714 "data_offset": 256, 00:41:55.714 "data_size": 7936 00:41:55.714 } 00:41:55.714 ] 00:41:55.714 }' 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:41:55.714 11:52:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:41:55.973 [2024-06-10 11:52:39.822148] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:41:55.973 [2024-06-10 11:52:39.822190] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:41:55.973 [2024-06-10 11:52:39.822249] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:56.909 "name": "raid_bdev1", 00:41:56.909 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:56.909 "strip_size_kb": 0, 00:41:56.909 "state": "online", 00:41:56.909 "raid_level": "raid1", 00:41:56.909 "superblock": true, 00:41:56.909 "num_base_bdevs": 2, 00:41:56.909 "num_base_bdevs_discovered": 2, 00:41:56.909 "num_base_bdevs_operational": 2, 00:41:56.909 "base_bdevs_list": [ 00:41:56.909 { 00:41:56.909 "name": "spare", 00:41:56.909 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:56.909 "is_configured": true, 00:41:56.909 "data_offset": 256, 00:41:56.909 "data_size": 7936 00:41:56.909 }, 00:41:56.909 { 00:41:56.909 "name": "BaseBdev2", 00:41:56.909 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:56.909 "is_configured": true, 00:41:56.909 "data_offset": 256, 00:41:56.909 "data_size": 7936 00:41:56.909 } 00:41:56.909 ] 00:41:56.909 }' 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:56.909 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:57.168 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:57.168 "name": "raid_bdev1", 00:41:57.168 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:57.168 "strip_size_kb": 0, 00:41:57.168 "state": "online", 00:41:57.168 "raid_level": "raid1", 00:41:57.168 "superblock": true, 00:41:57.168 "num_base_bdevs": 2, 00:41:57.168 "num_base_bdevs_discovered": 2, 00:41:57.168 "num_base_bdevs_operational": 2, 00:41:57.168 "base_bdevs_list": [ 00:41:57.168 { 00:41:57.168 "name": "spare", 00:41:57.168 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:57.168 
"is_configured": true, 00:41:57.168 "data_offset": 256, 00:41:57.168 "data_size": 7936 00:41:57.168 }, 00:41:57.168 { 00:41:57.168 "name": "BaseBdev2", 00:41:57.168 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:57.168 "is_configured": true, 00:41:57.168 "data_offset": 256, 00:41:57.168 "data_size": 7936 00:41:57.168 } 00:41:57.168 ] 00:41:57.168 }' 00:41:57.168 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:57.168 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:57.168 11:52:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:57.168 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:57.427 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:57.427 "name": "raid_bdev1", 00:41:57.427 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:57.427 "strip_size_kb": 0, 00:41:57.427 "state": "online", 00:41:57.427 "raid_level": "raid1", 00:41:57.427 "superblock": true, 00:41:57.427 "num_base_bdevs": 2, 00:41:57.427 "num_base_bdevs_discovered": 2, 00:41:57.427 "num_base_bdevs_operational": 2, 00:41:57.427 "base_bdevs_list": [ 00:41:57.427 { 00:41:57.427 "name": "spare", 00:41:57.427 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:57.427 "is_configured": true, 00:41:57.427 "data_offset": 256, 00:41:57.427 "data_size": 7936 00:41:57.427 }, 00:41:57.427 { 00:41:57.427 "name": "BaseBdev2", 00:41:57.427 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:57.427 "is_configured": true, 00:41:57.427 "data_offset": 256, 00:41:57.427 "data_size": 7936 00:41:57.427 } 00:41:57.427 ] 00:41:57.427 }' 00:41:57.427 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:57.427 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:57.994 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:41:57.994 [2024-06-10 11:52:41.842935] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:41:57.994 [2024-06-10 11:52:41.842960] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:41:57.994 [2024-06-10 11:52:41.843003] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:41:57.994 [2024-06-10 11:52:41.843040] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:41:57.994 [2024-06-10 11:52:41.843049] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x281f5e0 name raid_bdev1, state offline 00:41:57.994 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:57.994 11:52:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:41:58.252 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:41:58.252 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:41:58.252 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:41:58.252 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:41:58.509 [2024-06-10 11:52:42.372314] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:41:58.509 [2024-06-10 11:52:42.372352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:41:58.509 [2024-06-10 11:52:42.372368] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268c1a0 00:41:58.509 [2024-06-10 11:52:42.372377] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:41:58.509 [2024-06-10 11:52:42.373683] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:41:58.509 [2024-06-10 11:52:42.373707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:41:58.509 [2024-06-10 11:52:42.373749] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:41:58.509 [2024-06-10 11:52:42.373769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:41:58.509 [2024-06-10 11:52:42.373827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:41:58.509 spare 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:58.509 11:52:42 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:58.509 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:58.769 [2024-06-10 11:52:42.474126] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2821b10 00:41:58.769 [2024-06-10 11:52:42.474141] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:41:58.769 [2024-06-10 11:52:42.474196] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2817740 00:41:58.769 [2024-06-10 11:52:42.474262] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2821b10 00:41:58.769 [2024-06-10 11:52:42.474269] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2821b10 00:41:58.769 [2024-06-10 11:52:42.474315] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:41:58.769 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:58.769 "name": "raid_bdev1", 00:41:58.769 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:58.769 "strip_size_kb": 0, 00:41:58.769 "state": "online", 00:41:58.769 "raid_level": "raid1", 00:41:58.769 "superblock": true, 00:41:58.769 "num_base_bdevs": 2, 00:41:58.769 "num_base_bdevs_discovered": 2, 00:41:58.769 "num_base_bdevs_operational": 2, 00:41:58.769 "base_bdevs_list": [ 00:41:58.769 { 00:41:58.769 "name": "spare", 00:41:58.769 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:58.769 "is_configured": true, 00:41:58.769 "data_offset": 256, 00:41:58.769 "data_size": 7936 00:41:58.769 }, 00:41:58.769 { 00:41:58.769 "name": "BaseBdev2", 00:41:58.769 "uuid": 
"e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:58.769 "is_configured": true, 00:41:58.769 "data_offset": 256, 00:41:58.769 "data_size": 7936 00:41:58.769 } 00:41:58.769 ] 00:41:58.769 }' 00:41:58.769 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:58.769 11:52:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:41:59.406 "name": "raid_bdev1", 00:41:59.406 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:59.406 "strip_size_kb": 0, 00:41:59.406 "state": "online", 00:41:59.406 "raid_level": "raid1", 00:41:59.406 "superblock": true, 00:41:59.406 "num_base_bdevs": 2, 00:41:59.406 "num_base_bdevs_discovered": 2, 00:41:59.406 "num_base_bdevs_operational": 2, 00:41:59.406 "base_bdevs_list": [ 00:41:59.406 { 00:41:59.406 "name": "spare", 00:41:59.406 "uuid": 
"d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:41:59.406 "is_configured": true, 00:41:59.406 "data_offset": 256, 00:41:59.406 "data_size": 7936 00:41:59.406 }, 00:41:59.406 { 00:41:59.406 "name": "BaseBdev2", 00:41:59.406 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:59.406 "is_configured": true, 00:41:59.406 "data_offset": 256, 00:41:59.406 "data_size": 7936 00:41:59.406 } 00:41:59.406 ] 00:41:59.406 }' 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:59.406 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:41:59.664 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:41:59.664 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:41:59.922 [2024-06-10 11:52:43.619594] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:41:59.922 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:41:59.922 "name": "raid_bdev1", 00:41:59.922 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:41:59.922 "strip_size_kb": 0, 00:41:59.922 "state": "online", 00:41:59.922 "raid_level": "raid1", 00:41:59.922 "superblock": true, 00:41:59.922 "num_base_bdevs": 2, 00:41:59.922 "num_base_bdevs_discovered": 1, 00:41:59.922 "num_base_bdevs_operational": 1, 00:41:59.922 "base_bdevs_list": [ 00:41:59.922 { 00:41:59.922 "name": null, 00:41:59.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:41:59.922 "is_configured": false, 00:41:59.922 "data_offset": 
256, 00:41:59.922 "data_size": 7936 00:41:59.922 }, 00:41:59.922 { 00:41:59.922 "name": "BaseBdev2", 00:41:59.922 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:41:59.922 "is_configured": true, 00:41:59.923 "data_offset": 256, 00:41:59.923 "data_size": 7936 00:41:59.923 } 00:41:59.923 ] 00:41:59.923 }' 00:41:59.923 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:41:59.923 11:52:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:42:00.489 11:52:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:42:00.748 [2024-06-10 11:52:44.481832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:42:00.748 [2024-06-10 11:52:44.481945] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:42:00.748 [2024-06-10 11:52:44.481957] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
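The trace above repeatedly runs the `verify_raid_bdev_process` helper from `bdev_raid.sh` (lines @182-190): it fetches the raid bdev JSON via `rpc.py bdev_raid_get_bdevs all`, then jq-extracts `.process.type` and `.process.target` with a `// "none"` default before pattern-matching them. A minimal self-contained sketch of that check follows; since no SPDK target is running here, the RPC response is stubbed inline and `sed` stands in for `jq` (field values mirror the dumps above):

```shell
#!/usr/bin/env bash
# Hedged sketch of the verify_raid_bdev_process logic seen in this trace.
# The real helper queries a live SPDK target with rpc.py | jq; here the
# response is a stubbed, illustrative JSON string.
raid_bdev_info='{"name":"raid_bdev1","process":{"type":"rebuild","target":"spare"}}'

# Extract a quoted string field, defaulting to "none" like the trace's
# jq filters ('.process.type // "none"').
get_field() {
  local field=$1 value
  value=$(printf '%s' "$raid_bdev_info" |
    sed -n "s/.*\"$field\":\"\([^\"]*\)\".*/\1/p")
  echo "${value:-none}"
}

process_type=$(get_field type)
process_target=$(get_field target)

# Mirror the trace's checks: [[ rebuild == \r\e\b\u\i\l\d ]] etc.
if [ "$process_type" = "rebuild" ] && [ "$process_target" = "spare" ]; then
  echo "process ok: type=$process_type target=$process_target"
fi
```

Against a bdev with no active process the same filters yield `none`/`none`, which is what the `verify_raid_bdev_process raid_bdev1 none none` calls in this trace assert after the rebuild finishes.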
00:42:00.748 [2024-06-10 11:52:44.481978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:42:00.748 [2024-06-10 11:52:44.485143] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2817740 00:42:00.748 [2024-06-10 11:52:44.486723] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:42:00.748 11:52:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:01.681 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:01.940 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:42:01.940 "name": "raid_bdev1", 00:42:01.940 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:01.940 "strip_size_kb": 0, 00:42:01.940 "state": "online", 00:42:01.940 "raid_level": "raid1", 00:42:01.940 "superblock": true, 00:42:01.940 "num_base_bdevs": 2, 00:42:01.940 "num_base_bdevs_discovered": 2, 00:42:01.940 "num_base_bdevs_operational": 2, 00:42:01.940 "process": { 00:42:01.940 "type": 
"rebuild", 00:42:01.940 "target": "spare", 00:42:01.940 "progress": { 00:42:01.940 "blocks": 2816, 00:42:01.940 "percent": 35 00:42:01.940 } 00:42:01.940 }, 00:42:01.940 "base_bdevs_list": [ 00:42:01.940 { 00:42:01.940 "name": "spare", 00:42:01.940 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:42:01.940 "is_configured": true, 00:42:01.940 "data_offset": 256, 00:42:01.940 "data_size": 7936 00:42:01.940 }, 00:42:01.940 { 00:42:01.940 "name": "BaseBdev2", 00:42:01.940 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:01.940 "is_configured": true, 00:42:01.940 "data_offset": 256, 00:42:01.940 "data_size": 7936 00:42:01.940 } 00:42:01.940 ] 00:42:01.940 }' 00:42:01.940 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:42:01.940 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:42:01.940 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:42:01.940 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:42:01.940 11:52:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:42:02.198 [2024-06-10 11:52:45.939098] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:42:02.198 [2024-06-10 11:52:45.997894] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:42:02.198 [2024-06-10 11:52:45.997926] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:42:02.198 [2024-06-10 11:52:45.997936] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:42:02.198 [2024-06-10 11:52:45.997943] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:02.198 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:02.456 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:42:02.456 "name": "raid_bdev1", 00:42:02.456 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:02.456 "strip_size_kb": 0, 00:42:02.456 "state": "online", 00:42:02.456 "raid_level": "raid1", 00:42:02.456 "superblock": true, 00:42:02.456 
"num_base_bdevs": 2, 00:42:02.456 "num_base_bdevs_discovered": 1, 00:42:02.456 "num_base_bdevs_operational": 1, 00:42:02.456 "base_bdevs_list": [ 00:42:02.456 { 00:42:02.456 "name": null, 00:42:02.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:02.456 "is_configured": false, 00:42:02.456 "data_offset": 256, 00:42:02.456 "data_size": 7936 00:42:02.456 }, 00:42:02.456 { 00:42:02.456 "name": "BaseBdev2", 00:42:02.456 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:02.456 "is_configured": true, 00:42:02.456 "data_offset": 256, 00:42:02.456 "data_size": 7936 00:42:02.456 } 00:42:02.456 ] 00:42:02.456 }' 00:42:02.456 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:42:02.456 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:42:03.022 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:42:03.022 [2024-06-10 11:52:46.864368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:42:03.022 [2024-06-10 11:52:46.864408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:42:03.022 [2024-06-10 11:52:46.864428] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268a460 00:42:03.022 [2024-06-10 11:52:46.864437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:42:03.022 [2024-06-10 11:52:46.864569] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:42:03.022 [2024-06-10 11:52:46.864580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:42:03.022 [2024-06-10 11:52:46.864618] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:42:03.022 [2024-06-10 11:52:46.864626] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:42:03.022 [2024-06-10 11:52:46.864634] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:42:03.022 [2024-06-10 11:52:46.864647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:42:03.022 [2024-06-10 11:52:46.867793] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28301c0 00:42:03.022 [2024-06-10 11:52:46.868851] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:42:03.022 spare 00:42:03.022 11:52:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:03.956 11:52:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:04.214 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:42:04.214 "name": "raid_bdev1", 00:42:04.214 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 
00:42:04.214 "strip_size_kb": 0, 00:42:04.214 "state": "online", 00:42:04.214 "raid_level": "raid1", 00:42:04.214 "superblock": true, 00:42:04.214 "num_base_bdevs": 2, 00:42:04.214 "num_base_bdevs_discovered": 2, 00:42:04.214 "num_base_bdevs_operational": 2, 00:42:04.214 "process": { 00:42:04.214 "type": "rebuild", 00:42:04.214 "target": "spare", 00:42:04.214 "progress": { 00:42:04.214 "blocks": 2816, 00:42:04.214 "percent": 35 00:42:04.214 } 00:42:04.214 }, 00:42:04.214 "base_bdevs_list": [ 00:42:04.214 { 00:42:04.214 "name": "spare", 00:42:04.214 "uuid": "d2fe6068-7ba3-5124-9bd5-3d859d6a1574", 00:42:04.214 "is_configured": true, 00:42:04.214 "data_offset": 256, 00:42:04.214 "data_size": 7936 00:42:04.214 }, 00:42:04.214 { 00:42:04.214 "name": "BaseBdev2", 00:42:04.214 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:04.214 "is_configured": true, 00:42:04.214 "data_offset": 256, 00:42:04.214 "data_size": 7936 00:42:04.214 } 00:42:04.214 ] 00:42:04.214 }' 00:42:04.214 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:42:04.214 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:42:04.214 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:42:04.214 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:42:04.214 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:42:04.472 [2024-06-10 11:52:48.313108] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:42:04.472 [2024-06-10 11:52:48.379846] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:42:04.472 [2024-06-10 
11:52:48.379883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:42:04.472 [2024-06-10 11:52:48.379910] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:42:04.472 [2024-06-10 11:52:48.379916] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:04.472 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:04.729 11:52:48 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:42:04.730 "name": "raid_bdev1", 00:42:04.730 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:04.730 "strip_size_kb": 0, 00:42:04.730 "state": "online", 00:42:04.730 "raid_level": "raid1", 00:42:04.730 "superblock": true, 00:42:04.730 "num_base_bdevs": 2, 00:42:04.730 "num_base_bdevs_discovered": 1, 00:42:04.730 "num_base_bdevs_operational": 1, 00:42:04.730 "base_bdevs_list": [ 00:42:04.730 { 00:42:04.730 "name": null, 00:42:04.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:04.730 "is_configured": false, 00:42:04.730 "data_offset": 256, 00:42:04.730 "data_size": 7936 00:42:04.730 }, 00:42:04.730 { 00:42:04.730 "name": "BaseBdev2", 00:42:04.730 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:04.730 "is_configured": true, 00:42:04.730 "data_offset": 256, 00:42:04.730 "data_size": 7936 00:42:04.730 } 00:42:04.730 ] 00:42:04.730 }' 00:42:04.730 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:42:04.730 11:52:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:42:05.296 "name": "raid_bdev1", 00:42:05.296 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:05.296 "strip_size_kb": 0, 00:42:05.296 "state": "online", 00:42:05.296 "raid_level": "raid1", 00:42:05.296 "superblock": true, 00:42:05.296 "num_base_bdevs": 2, 00:42:05.296 "num_base_bdevs_discovered": 1, 00:42:05.296 "num_base_bdevs_operational": 1, 00:42:05.296 "base_bdevs_list": [ 00:42:05.296 { 00:42:05.296 "name": null, 00:42:05.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:05.296 "is_configured": false, 00:42:05.296 "data_offset": 256, 00:42:05.296 "data_size": 7936 00:42:05.296 }, 00:42:05.296 { 00:42:05.296 "name": "BaseBdev2", 00:42:05.296 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:05.296 "is_configured": true, 00:42:05.296 "data_offset": 256, 00:42:05.296 "data_size": 7936 00:42:05.296 } 00:42:05.296 ] 00:42:05.296 }' 00:42:05.296 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:42:05.554 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:42:05.554 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:42:05.554 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:42:05.554 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:42:05.554 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:42:05.813 [2024-06-10 11:52:49.630953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:42:05.813 [2024-06-10 11:52:49.630992] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:42:05.813 [2024-06-10 11:52:49.631008] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268d4a0 00:42:05.813 [2024-06-10 11:52:49.631017] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:42:05.813 [2024-06-10 11:52:49.631136] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:42:05.813 [2024-06-10 11:52:49.631148] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:42:05.813 [2024-06-10 11:52:49.631181] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:42:05.813 [2024-06-10 11:52:49.631189] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:42:05.813 [2024-06-10 11:52:49.631197] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:42:05.813 BaseBdev1 00:42:05.813 11:52:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:42:06.748 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:42:06.748 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:42:06.748 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:06.749 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:07.007 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:42:07.007 "name": "raid_bdev1", 00:42:07.007 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:07.007 "strip_size_kb": 0, 00:42:07.007 "state": "online", 00:42:07.007 "raid_level": "raid1", 00:42:07.007 "superblock": true, 00:42:07.007 "num_base_bdevs": 2, 00:42:07.007 "num_base_bdevs_discovered": 1, 00:42:07.007 "num_base_bdevs_operational": 1, 00:42:07.007 "base_bdevs_list": [ 00:42:07.007 { 00:42:07.007 "name": null, 00:42:07.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:07.007 "is_configured": false, 00:42:07.007 "data_offset": 256, 00:42:07.007 "data_size": 7936 00:42:07.007 }, 00:42:07.007 { 00:42:07.007 "name": "BaseBdev2", 00:42:07.007 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:07.007 "is_configured": true, 00:42:07.007 "data_offset": 256, 00:42:07.007 
"data_size": 7936 00:42:07.007 } 00:42:07.007 ] 00:42:07.007 }' 00:42:07.007 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:42:07.007 11:52:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:42:07.573 "name": "raid_bdev1", 00:42:07.573 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:07.573 "strip_size_kb": 0, 00:42:07.573 "state": "online", 00:42:07.573 "raid_level": "raid1", 00:42:07.573 "superblock": true, 00:42:07.573 "num_base_bdevs": 2, 00:42:07.573 "num_base_bdevs_discovered": 1, 00:42:07.573 "num_base_bdevs_operational": 1, 00:42:07.573 "base_bdevs_list": [ 00:42:07.573 { 00:42:07.573 "name": null, 00:42:07.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:07.573 "is_configured": false, 00:42:07.573 "data_offset": 256, 00:42:07.573 "data_size": 7936 00:42:07.573 }, 
00:42:07.573 { 00:42:07.573 "name": "BaseBdev2", 00:42:07.573 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:07.573 "is_configured": true, 00:42:07.573 "data_offset": 256, 00:42:07.573 "data_size": 7936 00:42:07.573 } 00:42:07.573 ] 00:42:07.573 }' 00:42:07.573 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 
00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:42:07.831 [2024-06-10 11:52:51.708366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:42:07.831 [2024-06-10 11:52:51.708463] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:42:07.831 [2024-06-10 11:52:51.708475] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:42:07.831 request: 00:42:07.831 { 00:42:07.831 "raid_bdev": "raid_bdev1", 00:42:07.831 "base_bdev": "BaseBdev1", 00:42:07.831 "method": "bdev_raid_add_base_bdev", 00:42:07.831 "req_id": 1 00:42:07.831 } 00:42:07.831 Got JSON-RPC error response 00:42:07.831 response: 00:42:07.831 { 00:42:07.831 "code": -22, 00:42:07.831 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:42:07.831 } 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 
00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:07.831 11:52:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:09.202 11:52:52 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:42:09.202 "name": "raid_bdev1", 00:42:09.202 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:09.202 "strip_size_kb": 0, 00:42:09.202 "state": "online", 00:42:09.202 "raid_level": "raid1", 00:42:09.202 "superblock": true, 00:42:09.202 "num_base_bdevs": 2, 00:42:09.202 "num_base_bdevs_discovered": 1, 00:42:09.202 "num_base_bdevs_operational": 1, 00:42:09.202 "base_bdevs_list": [ 00:42:09.202 { 00:42:09.202 "name": null, 00:42:09.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:09.202 "is_configured": false, 00:42:09.202 "data_offset": 256, 00:42:09.202 "data_size": 7936 00:42:09.202 }, 00:42:09.202 { 00:42:09.202 "name": "BaseBdev2", 00:42:09.202 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:09.202 "is_configured": true, 00:42:09.202 "data_offset": 256, 00:42:09.202 "data_size": 7936 00:42:09.202 } 00:42:09.202 ] 00:42:09.202 }' 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:42:09.202 11:52:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:42:09.459 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:42:09.717 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:42:09.717 "name": "raid_bdev1", 00:42:09.717 "uuid": "36ef460d-29cf-44b8-923d-b827c21c9965", 00:42:09.717 "strip_size_kb": 0, 00:42:09.717 "state": "online", 00:42:09.717 "raid_level": "raid1", 00:42:09.717 "superblock": true, 00:42:09.717 "num_base_bdevs": 2, 00:42:09.717 "num_base_bdevs_discovered": 1, 00:42:09.717 "num_base_bdevs_operational": 1, 00:42:09.717 "base_bdevs_list": [ 00:42:09.717 { 00:42:09.717 "name": null, 00:42:09.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:42:09.717 "is_configured": false, 00:42:09.718 "data_offset": 256, 00:42:09.718 "data_size": 7936 00:42:09.718 }, 00:42:09.718 { 00:42:09.718 "name": "BaseBdev2", 00:42:09.718 "uuid": "e99c8034-e6e1-54d3-83cc-ed7232aa2a5d", 00:42:09.718 "is_configured": true, 00:42:09.718 "data_offset": 256, 00:42:09.718 "data_size": 7936 00:42:09.718 } 00:42:09.718 ] 00:42:09.718 }' 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 261127 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 261127 ']' 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@953 -- # kill -0 261127 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:09.718 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 261127 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 261127' 00:42:09.976 killing process with pid 261127 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 261127 00:42:09.976 Received shutdown signal, test time was about 60.000000 seconds 00:42:09.976 00:42:09.976 Latency(us) 00:42:09.976 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:09.976 =================================================================================================================== 00:42:09.976 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:42:09.976 [2024-06-10 11:52:53.680049] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:42:09.976 [2024-06-10 11:52:53.680114] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:42:09.976 [2024-06-10 11:52:53.680143] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:42:09.976 [2024-06-10 11:52:53.680151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2821b10 name raid_bdev1, state offline 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@973 -- # wait 261127 00:42:09.976 [2024-06-10 11:52:53.709287] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:42:09.976 00:42:09.976 real 0m24.020s 00:42:09.976 user 0m36.869s 00:42:09.976 sys 0m3.194s 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:09.976 11:52:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:42:09.976 ************************************ 00:42:09.976 END TEST raid_rebuild_test_sb_md_interleaved 00:42:09.976 ************************************ 00:42:10.235 11:52:53 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:42:10.235 11:52:53 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:42:10.235 11:52:53 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 261127 ']' 00:42:10.235 11:52:53 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 261127 00:42:10.235 11:52:53 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:42:10.235 00:42:10.235 real 14m33.021s 00:42:10.235 user 24m5.301s 00:42:10.235 sys 2m45.589s 00:42:10.235 11:52:53 bdev_raid -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:10.235 11:52:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:42:10.235 ************************************ 00:42:10.235 END TEST bdev_raid 00:42:10.235 ************************************ 00:42:10.235 11:52:54 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:42:10.235 11:52:54 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:10.235 11:52:54 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:10.235 11:52:54 -- common/autotest_common.sh@10 -- # set +x 00:42:10.235 ************************************ 00:42:10.235 START TEST bdevperf_config 00:42:10.235 
************************************ 00:42:10.235 11:52:54 bdevperf_config -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:42:10.235 * Looking for test storage... 00:42:10.235 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:10.235 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:42:10.235 11:52:54 bdevperf_config -- 
bdevperf/common.sh@8 -- # local job_section=job0 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:10.235 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:10.235 11:52:54 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:10.236 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:10.236 00:42:10.236 11:52:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:42:10.494 
11:52:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:10.494 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:10.494 11:52:54 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:13.026 11:52:56 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-06-10 11:52:54.240615] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:13.026 [2024-06-10 11:52:54.240679] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid264735 ] 00:42:13.026 Using job config with 4 jobs 00:42:13.026 [2024-06-10 11:52:54.333254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:13.026 [2024-06-10 11:52:54.428069] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:13.026 cpumask for '\''job0'\'' is too big 00:42:13.026 cpumask for '\''job1'\'' is too big 00:42:13.026 cpumask for '\''job2'\'' is too big 00:42:13.026 cpumask for '\''job3'\'' is too big 00:42:13.026 Running I/O for 2 seconds... 
00:42:13.026 00:42:13.026 Latency(us) 00:42:13.026 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 37998.35 37.11 0.00 0.00 6733.99 1260.86 10371.78 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 38011.22 37.12 0.00 0.00 6721.67 1154.00 9175.04 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 37989.76 37.10 0.00 0.00 6715.82 1175.37 8092.27 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.02 37968.33 37.08 0.00 0.00 6710.33 1210.99 8092.27 00:42:13.026 =================================================================================================================== 00:42:13.026 Total : 151967.66 148.41 0.00 0.00 6720.44 1154.00 10371.78' 00:42:13.026 11:52:56 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-06-10 11:52:54.240615] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:13.026 [2024-06-10 11:52:54.240679] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid264735 ] 00:42:13.026 Using job config with 4 jobs 00:42:13.026 [2024-06-10 11:52:54.333254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:13.026 [2024-06-10 11:52:54.428069] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:13.026 cpumask for '\''job0'\'' is too big 00:42:13.026 cpumask for '\''job1'\'' is too big 00:42:13.026 cpumask for '\''job2'\'' is too big 00:42:13.026 cpumask for '\''job3'\'' is too big 00:42:13.026 Running I/O for 2 seconds... 
00:42:13.026 00:42:13.026 Latency(us) 00:42:13.026 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 37998.35 37.11 0.00 0.00 6733.99 1260.86 10371.78 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 38011.22 37.12 0.00 0.00 6721.67 1154.00 9175.04 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 37989.76 37.10 0.00 0.00 6715.82 1175.37 8092.27 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.02 37968.33 37.08 0.00 0.00 6710.33 1210.99 8092.27 00:42:13.026 =================================================================================================================== 00:42:13.026 Total : 151967.66 148.41 0.00 0.00 6720.44 1154.00 10371.78' 00:42:13.026 11:52:56 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 11:52:54.240615] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:13.026 [2024-06-10 11:52:54.240679] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid264735 ] 00:42:13.026 Using job config with 4 jobs 00:42:13.026 [2024-06-10 11:52:54.333254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:13.026 [2024-06-10 11:52:54.428069] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:13.026 cpumask for '\''job0'\'' is too big 00:42:13.026 cpumask for '\''job1'\'' is too big 00:42:13.026 cpumask for '\''job2'\'' is too big 00:42:13.026 cpumask for '\''job3'\'' is too big 00:42:13.026 Running I/O for 2 seconds... 
00:42:13.026 00:42:13.026 Latency(us) 00:42:13.026 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 37998.35 37.11 0.00 0.00 6733.99 1260.86 10371.78 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 38011.22 37.12 0.00 0.00 6721.67 1154.00 9175.04 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.01 37989.76 37.10 0.00 0.00 6715.82 1175.37 8092.27 00:42:13.026 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:13.026 Malloc0 : 2.02 37968.33 37.08 0.00 0.00 6710.33 1210.99 8092.27 00:42:13.026 =================================================================================================================== 00:42:13.026 Total : 151967.66 148.41 0.00 0.00 6720.44 1154.00 10371.78' 00:42:13.026 11:52:56 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:42:13.027 11:52:56 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:42:13.027 11:52:56 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:42:13.027 11:52:56 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:13.027 [2024-06-10 11:52:56.872608] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:13.027 [2024-06-10 11:52:56.872662] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265090 ] 00:42:13.027 [2024-06-10 11:52:56.967936] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:13.285 [2024-06-10 11:52:57.063366] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:13.285 cpumask for 'job0' is too big 00:42:13.285 cpumask for 'job1' is too big 00:42:13.285 cpumask for 'job2' is too big 00:42:13.285 cpumask for 'job3' is too big 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:42:15.816 Running I/O for 2 seconds... 00:42:15.816 00:42:15.816 Latency(us) 00:42:15.816 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:15.816 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:15.816 Malloc0 : 2.01 37786.05 36.90 0.00 0.00 6770.80 1232.36 10371.78 00:42:15.816 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:15.816 Malloc0 : 2.01 37764.54 36.88 0.00 0.00 6764.97 1154.00 9232.03 00:42:15.816 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:15.816 Malloc0 : 2.01 37743.19 36.86 0.00 0.00 6758.95 1210.99 8149.26 00:42:15.816 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:42:15.816 Malloc0 : 2.02 37721.85 36.84 0.00 0.00 6753.37 1175.37 8092.27 00:42:15.816 =================================================================================================================== 00:42:15.816 Total : 151015.64 147.48 0.00 0.00 6762.02 1154.00 10371.78' 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:15.816 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:42:15.816 11:52:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:15.817 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:42:15.817 11:52:59 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:15.817 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:15.817 11:52:59 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:18.346 11:53:02 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-06-10 11:52:59.507671] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:18.346 [2024-06-10 11:52:59.507719] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265439 ] 00:42:18.346 Using job config with 3 jobs 00:42:18.346 [2024-06-10 11:52:59.599159] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:18.346 [2024-06-10 11:52:59.691448] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:18.346 cpumask for '\''job0'\'' is too big 00:42:18.346 cpumask for '\''job1'\'' is too big 00:42:18.346 cpumask for '\''job2'\'' is too big 00:42:18.346 Running I/O for 2 seconds... 
00:42:18.346 00:42:18.346 Latency(us) 00:42:18.346 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:18.346 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.346 Malloc0 : 2.01 51534.72 50.33 0.00 0.00 4964.16 1182.50 7294.44 00:42:18.346 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.346 Malloc0 : 2.01 51505.18 50.30 0.00 0.00 4959.88 1118.39 6183.18 00:42:18.346 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.346 Malloc0 : 2.01 51561.23 50.35 0.00 0.00 4947.54 577.00 6154.69 00:42:18.346 =================================================================================================================== 00:42:18.346 Total : 154601.14 150.98 0.00 0.00 4957.19 577.00 7294.44' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-06-10 11:52:59.507671] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:18.347 [2024-06-10 11:52:59.507719] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265439 ] 00:42:18.347 Using job config with 3 jobs 00:42:18.347 [2024-06-10 11:52:59.599159] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:18.347 [2024-06-10 11:52:59.691448] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:18.347 cpumask for '\''job0'\'' is too big 00:42:18.347 cpumask for '\''job1'\'' is too big 00:42:18.347 cpumask for '\''job2'\'' is too big 00:42:18.347 Running I/O for 2 seconds... 
00:42:18.347 00:42:18.347 Latency(us) 00:42:18.347 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:18.347 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.347 Malloc0 : 2.01 51534.72 50.33 0.00 0.00 4964.16 1182.50 7294.44 00:42:18.347 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.347 Malloc0 : 2.01 51505.18 50.30 0.00 0.00 4959.88 1118.39 6183.18 00:42:18.347 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.347 Malloc0 : 2.01 51561.23 50.35 0.00 0.00 4947.54 577.00 6154.69 00:42:18.347 =================================================================================================================== 00:42:18.347 Total : 154601.14 150.98 0.00 0.00 4957.19 577.00 7294.44' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 11:52:59.507671] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:18.347 [2024-06-10 11:52:59.507719] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265439 ] 00:42:18.347 Using job config with 3 jobs 00:42:18.347 [2024-06-10 11:52:59.599159] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:18.347 [2024-06-10 11:52:59.691448] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:18.347 cpumask for '\''job0'\'' is too big 00:42:18.347 cpumask for '\''job1'\'' is too big 00:42:18.347 cpumask for '\''job2'\'' is too big 00:42:18.347 Running I/O for 2 seconds... 
00:42:18.347 00:42:18.347 Latency(us) 00:42:18.347 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:18.347 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.347 Malloc0 : 2.01 51534.72 50.33 0.00 0.00 4964.16 1182.50 7294.44 00:42:18.347 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.347 Malloc0 : 2.01 51505.18 50.30 0.00 0.00 4959.88 1118.39 6183.18 00:42:18.347 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:42:18.347 Malloc0 : 2.01 51561.23 50.35 0.00 0.00 4947.54 577.00 6154.69 00:42:18.347 =================================================================================================================== 00:42:18.347 Total : 154601.14 150.98 0.00 0.00 4957.19 577.00 7294.44' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:42:18.347 11:53:02 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:18.347 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:18.347 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:18.347 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:18.347 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:42:18.347 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:42:18.347 11:53:02 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:20.876 11:53:04 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-06-10 11:53:02.162923] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:20.876 [2024-06-10 11:53:02.162979] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265800 ] 00:42:20.876 Using job config with 4 jobs 00:42:20.876 [2024-06-10 11:53:02.265017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:20.876 [2024-06-10 11:53:02.363515] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:20.876 cpumask for '\''job0'\'' is too big 00:42:20.876 cpumask for '\''job1'\'' is too big 00:42:20.876 cpumask for '\''job2'\'' is too big 00:42:20.876 cpumask for '\''job3'\'' is too big 00:42:20.876 Running I/O for 2 seconds... 00:42:20.876 00:42:20.876 Latency(us) 00:42:20.876 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:20.876 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc0 : 2.02 18860.32 18.42 0.00 0.00 13566.39 2478.97 20401.64 00:42:20.876 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc1 : 2.02 18848.73 18.41 0.00 0.00 13567.02 2906.38 20401.64 00:42:20.876 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc0 : 2.02 18837.44 18.40 0.00 0.00 13545.17 2379.24 18008.15 00:42:20.876 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc1 : 2.03 18826.49 18.39 0.00 0.00 13544.74 2863.64 18008.15 00:42:20.876 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc0 : 2.03 18815.92 18.37 0.00 0.00 13521.62 2364.99 15956.59 00:42:20.876 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc1 : 2.03 18804.63 18.36 0.00 0.00 13523.43 2877.89 16070.57 00:42:20.876 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.876 Malloc0 : 2.03 18793.36 18.35 0.00 0.00 13500.69 2364.99 16070.57 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18782.37 18.34 0.00 0.00 13500.46 2877.89 16184.54 00:42:20.877 =================================================================================================================== 00:42:20.877 Total : 150569.26 147.04 0.00 0.00 13533.69 2364.99 20401.64' 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-06-10 11:53:02.162923] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:20.877 [2024-06-10 11:53:02.162979] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265800 ] 00:42:20.877 Using job config with 4 jobs 00:42:20.877 [2024-06-10 11:53:02.265017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:20.877 [2024-06-10 11:53:02.363515] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:20.877 cpumask for '\''job0'\'' is too big 00:42:20.877 cpumask for '\''job1'\'' is too big 00:42:20.877 cpumask for '\''job2'\'' is too big 00:42:20.877 cpumask for '\''job3'\'' is too big 00:42:20.877 Running I/O for 2 seconds... 
00:42:20.877 00:42:20.877 Latency(us) 00:42:20.877 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.02 18860.32 18.42 0.00 0.00 13566.39 2478.97 20401.64 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.02 18848.73 18.41 0.00 0.00 13567.02 2906.38 20401.64 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.02 18837.44 18.40 0.00 0.00 13545.17 2379.24 18008.15 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18826.49 18.39 0.00 0.00 13544.74 2863.64 18008.15 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.03 18815.92 18.37 0.00 0.00 13521.62 2364.99 15956.59 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18804.63 18.36 0.00 0.00 13523.43 2877.89 16070.57 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.03 18793.36 18.35 0.00 0.00 13500.69 2364.99 16070.57 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18782.37 18.34 0.00 0.00 13500.46 2877.89 16184.54 00:42:20.877 =================================================================================================================== 00:42:20.877 Total : 150569.26 147.04 0.00 0.00 13533.69 2364.99 20401.64' 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 11:53:02.162923] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:20.877 [2024-06-10 11:53:02.162979] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid265800 ] 00:42:20.877 Using job config with 4 jobs 00:42:20.877 [2024-06-10 11:53:02.265017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:20.877 [2024-06-10 11:53:02.363515] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:20.877 cpumask for '\''job0'\'' is too big 00:42:20.877 cpumask for '\''job1'\'' is too big 00:42:20.877 cpumask for '\''job2'\'' is too big 00:42:20.877 cpumask for '\''job3'\'' is too big 00:42:20.877 Running I/O for 2 seconds... 00:42:20.877 00:42:20.877 Latency(us) 00:42:20.877 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.02 18860.32 18.42 0.00 0.00 13566.39 2478.97 20401.64 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.02 18848.73 18.41 0.00 0.00 13567.02 2906.38 20401.64 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.02 18837.44 18.40 0.00 0.00 13545.17 2379.24 18008.15 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18826.49 18.39 0.00 0.00 13544.74 2863.64 18008.15 00:42:20.877 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.03 18815.92 18.37 0.00 0.00 13521.62 2364.99 15956.59 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18804.63 18.36 0.00 0.00 13523.43 2877.89 16070.57 00:42:20.877 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc0 : 2.03 18793.36 18.35 0.00 0.00 13500.69 2364.99 16070.57 00:42:20.877 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:42:20.877 Malloc1 : 2.03 18782.37 18.34 0.00 0.00 13500.46 2877.89 16184.54 00:42:20.877 =================================================================================================================== 00:42:20.877 Total : 150569.26 147.04 0.00 0.00 13533.69 2364.99 20401.64' 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:42:20.877 11:53:04 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:42:20.877 00:42:20.877 real 0m10.718s 00:42:20.877 user 0m9.571s 00:42:20.877 sys 0m1.004s 00:42:20.877 11:53:04 bdevperf_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:20.877 11:53:04 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:42:20.877 ************************************ 00:42:20.877 END TEST bdevperf_config 00:42:20.877 ************************************ 00:42:20.877 11:53:04 -- spdk/autotest.sh@192 -- # uname -s 00:42:20.877 11:53:04 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:42:20.877 11:53:04 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:42:20.877 11:53:04 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:20.877 11:53:04 -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:42:20.877 11:53:04 -- common/autotest_common.sh@10 -- # set +x 00:42:21.138 ************************************ 00:42:21.138 START TEST reactor_set_interrupt 00:42:21.138 ************************************ 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:42:21.138 * Looking for test storage... 00:42:21.138 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:21.138 11:53:04 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:42:21.138 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:42:21.138 11:53:04 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:42:21.138 11:53:04 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:42:21.138 11:53:04 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:42:21.138 11:53:04 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:42:21.139 11:53:04 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:42:21.139 11:53:04 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:42:21.139 11:53:04 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:42:21.139 11:53:04 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:42:21.139 11:53:04 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:42:21.139 11:53:04 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:42:21.139 11:53:04 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:21.139 11:53:04 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:42:21.139 11:53:04 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:42:21.139 #define SPDK_CONFIG_H 00:42:21.139 #define SPDK_CONFIG_APPS 1 00:42:21.139 #define SPDK_CONFIG_ARCH native 00:42:21.139 #undef SPDK_CONFIG_ASAN 00:42:21.139 #undef SPDK_CONFIG_AVAHI 00:42:21.139 #undef SPDK_CONFIG_CET 00:42:21.139 #define SPDK_CONFIG_COVERAGE 1 00:42:21.139 #define SPDK_CONFIG_CROSS_PREFIX 
00:42:21.139 #define SPDK_CONFIG_CRYPTO 1 00:42:21.139 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:42:21.139 #undef SPDK_CONFIG_CUSTOMOCF 00:42:21.139 #undef SPDK_CONFIG_DAOS 00:42:21.139 #define SPDK_CONFIG_DAOS_DIR 00:42:21.139 #define SPDK_CONFIG_DEBUG 1 00:42:21.139 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:42:21.139 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:42:21.139 #define SPDK_CONFIG_DPDK_INC_DIR 00:42:21.139 #define SPDK_CONFIG_DPDK_LIB_DIR 00:42:21.139 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:42:21.139 #undef SPDK_CONFIG_DPDK_UADK 00:42:21.139 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:42:21.139 #define SPDK_CONFIG_EXAMPLES 1 00:42:21.139 #undef SPDK_CONFIG_FC 00:42:21.139 #define SPDK_CONFIG_FC_PATH 00:42:21.139 #define SPDK_CONFIG_FIO_PLUGIN 1 00:42:21.139 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:42:21.139 #undef SPDK_CONFIG_FUSE 00:42:21.139 #undef SPDK_CONFIG_FUZZER 00:42:21.139 #define SPDK_CONFIG_FUZZER_LIB 00:42:21.139 #undef SPDK_CONFIG_GOLANG 00:42:21.139 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:42:21.139 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:42:21.139 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:42:21.139 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:42:21.139 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:42:21.139 #undef SPDK_CONFIG_HAVE_LIBBSD 00:42:21.139 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:42:21.139 #define SPDK_CONFIG_IDXD 1 00:42:21.139 #define SPDK_CONFIG_IDXD_KERNEL 1 00:42:21.139 #define SPDK_CONFIG_IPSEC_MB 1 00:42:21.139 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:42:21.139 #define SPDK_CONFIG_ISAL 1 00:42:21.139 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:42:21.139 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:42:21.139 #define SPDK_CONFIG_LIBDIR 00:42:21.139 #undef SPDK_CONFIG_LTO 00:42:21.139 #define SPDK_CONFIG_MAX_LCORES 00:42:21.139 #define SPDK_CONFIG_NVME_CUSE 1 00:42:21.139 #undef 
SPDK_CONFIG_OCF 00:42:21.139 #define SPDK_CONFIG_OCF_PATH 00:42:21.139 #define SPDK_CONFIG_OPENSSL_PATH 00:42:21.139 #undef SPDK_CONFIG_PGO_CAPTURE 00:42:21.139 #define SPDK_CONFIG_PGO_DIR 00:42:21.139 #undef SPDK_CONFIG_PGO_USE 00:42:21.139 #define SPDK_CONFIG_PREFIX /usr/local 00:42:21.139 #undef SPDK_CONFIG_RAID5F 00:42:21.139 #undef SPDK_CONFIG_RBD 00:42:21.139 #define SPDK_CONFIG_RDMA 1 00:42:21.139 #define SPDK_CONFIG_RDMA_PROV verbs 00:42:21.139 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:42:21.139 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:42:21.139 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:42:21.139 #define SPDK_CONFIG_SHARED 1 00:42:21.139 #undef SPDK_CONFIG_SMA 00:42:21.139 #define SPDK_CONFIG_TESTS 1 00:42:21.139 #undef SPDK_CONFIG_TSAN 00:42:21.139 #define SPDK_CONFIG_UBLK 1 00:42:21.139 #define SPDK_CONFIG_UBSAN 1 00:42:21.139 #undef SPDK_CONFIG_UNIT_TESTS 00:42:21.139 #undef SPDK_CONFIG_URING 00:42:21.139 #define SPDK_CONFIG_URING_PATH 00:42:21.139 #undef SPDK_CONFIG_URING_ZNS 00:42:21.139 #undef SPDK_CONFIG_USDT 00:42:21.139 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:42:21.139 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:42:21.139 #undef SPDK_CONFIG_VFIO_USER 00:42:21.139 #define SPDK_CONFIG_VFIO_USER_DIR 00:42:21.139 #define SPDK_CONFIG_VHOST 1 00:42:21.139 #define SPDK_CONFIG_VIRTIO 1 00:42:21.139 #undef SPDK_CONFIG_VTUNE 00:42:21.139 #define SPDK_CONFIG_VTUNE_DIR 00:42:21.139 #define SPDK_CONFIG_WERROR 1 00:42:21.139 #define SPDK_CONFIG_WPDK_DIR 00:42:21.139 #undef SPDK_CONFIG_XNVME 00:42:21.139 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:42:21.139 11:53:05 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:42:21.140 11:53:05 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:42:21.140 11:53:05 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:21.140 11:53:05 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:21.140 11:53:05 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:21.140 11:53:05 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:21.140 11:53:05 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:21.140 11:53:05 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:42:21.140 11:53:05 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:42:21.140 11:53:05 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:42:21.140 11:53:05 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:42:21.140 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:42:21.141 11:53:05 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:42:21.141 
11:53:05 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:21.141 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:42:21.142 11:53:05 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:42:21.142 11:53:05 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 266195 ]] 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 266195 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:42:21.142 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.EBzJx8 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.EBzJx8/tests/interrupt /tmp/spdk.EBzJx8 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size 
use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=956399616 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4328030208 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=84617773056 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508552192 00:42:21.402 11:53:05 reactor_set_interrupt 
-- common/autotest_common.sh@363 -- # uses["$mount"]=9890779136 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249563648 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892316672 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901712896 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9396224 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253676032 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254278144 00:42:21.402 11:53:05 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=602112 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:21.402 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:42:21.403 * Looking for test storage... 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=84617773056 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:42:21.403 11:53:05 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=12105371648 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.403 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1681 -- # set -o errtrace 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # true 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@1688 -- # xtrace_fd 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:42:21.403 11:53:05 reactor_set_interrupt -- 
common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:42:21.403 11:53:05 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=266236 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:21.403 11:53:05 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 266236 /var/tmp/spdk.sock 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 266236 ']' 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:21.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:21.403 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:42:21.403 [2024-06-10 11:53:05.148650] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:21.403 [2024-06-10 11:53:05.148697] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid266236 ] 00:42:21.403 [2024-06-10 11:53:05.233661] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:42:21.403 [2024-06-10 11:53:05.321500] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:21.403 [2024-06-10 11:53:05.321585] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:21.403 [2024-06-10 11:53:05.321587] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:21.662 [2024-06-10 11:53:05.390807] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:42:22.228 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:22.228 11:53:05 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:42:22.228 11:53:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:42:22.228 11:53:05 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:22.228 Malloc0 00:42:22.228 Malloc1 00:42:22.228 Malloc2 00:42:22.487 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:42:22.487 11:53:06 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:42:22.487 11:53:06 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:42:22.487 11:53:06 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:42:22.487 5000+0 records in 00:42:22.487 5000+0 records out 00:42:22.487 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0250815 s, 408 MB/s 00:42:22.487 11:53:06 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:42:22.487 AIO0 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 266236 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 266236 without_thd 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=266236 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:42:22.745 11:53:06 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:42:23.004 spdk_thread ids are 1 on reactor0. 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 266236 0 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266236 0 idle 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:23.004 11:53:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266236 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.29 reactor_0' 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266236 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.29 reactor_0 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:23.264 11:53:06 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 266236 1 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266236 1 idle 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:23.264 11:53:06 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266239 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 
00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266239 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 266236 2 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266236 2 idle 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:23.264 11:53:07 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:23.264 11:53:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:42:23.523 11:53:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266240 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2' 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266240 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:42:23.524 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:42:23.782 [2024-06-10 11:53:07.486335] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:42:23.782 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:42:23.782 [2024-06-10 11:53:07.658115] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:42:23.782 [2024-06-10 11:53:07.658557] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:23.782 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:42:24.041 [2024-06-10 11:53:07.821997] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:42:24.041 [2024-06-10 11:53:07.822176] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 266236 0 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 266236 0 busy 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:24.041 11:53:07 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:24.041 11:53:07 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266236 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.65 reactor_0' 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266236 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.65 reactor_0 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:42:24.299 11:53:08 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 266236 2 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 266236 2 busy 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:24.300 11:53:08 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266240 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266240 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:24.300 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:42:24.587 [2024-06-10 11:53:08.361997] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:42:24.587 [2024-06-10 11:53:08.362117] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 266236 2 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266236 2 idle 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:24.587 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266240 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.53 reactor_2' 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266240 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.53 reactor_2 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:24.877 11:53:08 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:42:24.877 [2024-06-10 11:53:08.709994] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:42:24.877 [2024-06-10 11:53:08.710121] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:42:24.877 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:42:25.137 [2024-06-10 11:53:08.882126] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:42:25.137 11:53:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 266236 0 00:42:25.137 11:53:08 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266236 0 idle 00:42:25.137 11:53:08 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266236 00:42:25.137 11:53:08 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:42:25.137 11:53:08 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266236 -w 256 00:42:25.138 11:53:08 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266236 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.35 reactor_0' 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266236 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.35 reactor_0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 
00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:42:25.138 11:53:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 266236 00:42:25.138 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 266236 ']' 00:42:25.138 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 266236 00:42:25.138 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 266236 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 266236' 00:42:25.397 killing process with pid 266236 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 266236 00:42:25.397 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 266236 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:42:25.655 11:53:09 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=266845 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:25.655 11:53:09 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 266845 /var/tmp/spdk.sock 00:42:25.655 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 266845 ']' 00:42:25.655 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:25.655 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:25.655 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:25.655 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:25.655 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:25.655 11:53:09 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:42:25.655 [2024-06-10 11:53:09.421419] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:25.655 [2024-06-10 11:53:09.421477] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid266845 ] 00:42:25.655 [2024-06-10 11:53:09.507536] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:42:25.655 [2024-06-10 11:53:09.586062] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:25.655 [2024-06-10 11:53:09.586147] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:25.655 [2024-06-10 11:53:09.586149] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:25.915 [2024-06-10 11:53:09.652931] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:42:26.481 11:53:10 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:26.481 11:53:10 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:42:26.481 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:42:26.481 11:53:10 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:26.739 Malloc0 00:42:26.739 Malloc1 00:42:26.739 Malloc2 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:42:26.739 5000+0 records in 00:42:26.739 5000+0 records out 00:42:26.739 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0258167 s, 397 MB/s 00:42:26.739 11:53:10 
reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:42:26.739 AIO0 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 266845 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 266845 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=266845 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:42:26.739 11:53:10 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # 
reactor_get_thread_ids 0x4 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:42:26.997 11:53:10 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:42:27.257 spdk_thread ids are 1 on reactor0. 
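The trace above (common.sh@55-62) shows how the test maps a reactor cpumask to spdk_thread ids: it strips the `0x` prefix, then filters `rpc.py thread_get_stats` output with jq. A minimal standalone sketch of that lookup, assuming `jq` is installed and using made-up stand-in JSON rather than real `thread_get_stats` output:

```shell
#!/usr/bin/env bash
# Stand-in for `rpc.py thread_get_stats` JSON (hypothetical values).
stats='{"threads":[{"id":1,"cpumask":"1"},{"id":2,"cpumask":"4"}]}'

reactor_get_thread_ids() {
  local reactor_cpumask=$1
  # common.sh@58 compares against .cpumask without the 0x prefix,
  # so normalize "0x4" -> "4" before handing it to jq.
  reactor_cpumask=$(printf '%x' "$reactor_cpumask")
  echo "$stats" | jq --arg reactor_cpumask "$reactor_cpumask" \
    '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
}

reactor_get_thread_ids 0x1
reactor_get_thread_ids 0x4
```

Note the comparison is a string match, which is why an empty result (as seen for `echo ''` in the trace when no thread pins to that exact mask) is a normal outcome rather than an error.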
00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 266845 0 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266845 0 idle 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:42:27.257 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266845 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.29 reactor_0' 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266845 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.29 reactor_0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = 
\b\u\s\y ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 266845 1 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266845 1 idle 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266884 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266884 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- 
# sed -e 's/^\s*//g' 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 266845 2 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266845 2 idle 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:27.516 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266885 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:42:27.775 11:53:11 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266885 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:42:27.775 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:42:28.034 [2024-06-10 11:53:11.742699] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:42:28.034 [2024-06-10 11:53:11.742893] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:42:28.034 [2024-06-10 11:53:11.743002] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:42:28.034 [2024-06-10 11:53:11.923036] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:42:28.034 [2024-06-10 11:53:11.923188] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 266845 0 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 266845 0 busy 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:28.034 11:53:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266845 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.67 reactor_0' 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266845 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.67 reactor_0 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:42:28.292 11:53:12 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 266845 2 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 266845 2 busy 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:28.292 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266885 root 20 0 128.2g 36864 23616 R 93.3 0.0 0:00.35 reactor_2' 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266885 root 20 0 128.2g 36864 23616 R 93.3 0.0 0:00.35 reactor_2 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:28.550 11:53:12 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.3 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:42:28.550 [2024-06-10 11:53:12.460517] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:42:28.550 [2024-06-10 11:53:12.460607] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 266845 2 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266845 2 idle 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:28.550 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266885 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.53 reactor_2' 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266885 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.53 reactor_2 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:28.808 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:42:29.066 [2024-06-10 11:53:12.813398] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:42:29.066 [2024-06-10 11:53:12.813559] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:42:29.066 [2024-06-10 11:53:12.813576] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 266845 0 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 266845 0 idle 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=266845 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 266845 -w 256 00:42:29.066 11:53:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 266845 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.38 reactor_0' 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 266845 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.38 reactor_0 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:42:29.066 11:53:13 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:42:29.066 11:53:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 266845 00:42:29.066 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 266845 ']' 00:42:29.066 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 266845 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 266845 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 266845' 00:42:29.325 killing process with pid 266845 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 266845 00:42:29.325 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 266845 00:42:29.584 11:53:13 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:42:29.584 11:53:13 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:42:29.584 00:42:29.584 real 0m8.457s 00:42:29.584 user 0m7.446s 00:42:29.584 sys 0m1.816s 00:42:29.584 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:29.584 11:53:13 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:42:29.584 ************************************ 00:42:29.584 END TEST reactor_set_interrupt 00:42:29.584 ************************************ 00:42:29.584 11:53:13 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:42:29.584 11:53:13 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:29.584 11:53:13 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:29.584 11:53:13 -- common/autotest_common.sh@10 -- # set +x 00:42:29.584 ************************************ 00:42:29.584 START TEST reap_unregistered_poller 00:42:29.584 ************************************ 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:42:29.585 * Looking for test storage... 
00:42:29.585 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:29.585 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:42:29.585 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:42:29.585 11:53:13 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:42:29.847 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:42:29.847 11:53:13 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:42:29.847 
11:53:13 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:42:29.847 11:53:13 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:42:29.848 11:53:13 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:42:29.848 11:53:13 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:42:29.848 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:42:29.848 #define SPDK_CONFIG_H 00:42:29.848 #define SPDK_CONFIG_APPS 1 00:42:29.848 #define SPDK_CONFIG_ARCH native 00:42:29.848 #undef SPDK_CONFIG_ASAN 00:42:29.848 #undef SPDK_CONFIG_AVAHI 00:42:29.848 #undef SPDK_CONFIG_CET 00:42:29.848 #define SPDK_CONFIG_COVERAGE 1 00:42:29.848 #define SPDK_CONFIG_CROSS_PREFIX 00:42:29.848 #define SPDK_CONFIG_CRYPTO 1 00:42:29.848 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:42:29.848 #undef SPDK_CONFIG_CUSTOMOCF 00:42:29.848 #undef SPDK_CONFIG_DAOS 00:42:29.848 #define SPDK_CONFIG_DAOS_DIR 00:42:29.848 #define SPDK_CONFIG_DEBUG 1 00:42:29.848 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:42:29.848 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:42:29.848 #define SPDK_CONFIG_DPDK_INC_DIR 00:42:29.848 #define SPDK_CONFIG_DPDK_LIB_DIR 00:42:29.848 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:42:29.848 #undef SPDK_CONFIG_DPDK_UADK 00:42:29.848 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:42:29.848 #define SPDK_CONFIG_EXAMPLES 1 00:42:29.848 #undef SPDK_CONFIG_FC 00:42:29.848 #define SPDK_CONFIG_FC_PATH 00:42:29.848 #define SPDK_CONFIG_FIO_PLUGIN 1 00:42:29.848 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:42:29.848 #undef SPDK_CONFIG_FUSE 00:42:29.848 #undef SPDK_CONFIG_FUZZER 00:42:29.848 #define SPDK_CONFIG_FUZZER_LIB 00:42:29.848 #undef SPDK_CONFIG_GOLANG 00:42:29.848 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:42:29.848 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:42:29.848 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:42:29.848 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:42:29.848 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:42:29.848 #undef SPDK_CONFIG_HAVE_LIBBSD 00:42:29.848 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:42:29.848 #define SPDK_CONFIG_IDXD 1 00:42:29.848 #define SPDK_CONFIG_IDXD_KERNEL 1 00:42:29.848 #define SPDK_CONFIG_IPSEC_MB 1 00:42:29.848 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:42:29.848 #define SPDK_CONFIG_ISAL 1 00:42:29.848 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:42:29.848 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:42:29.848 #define SPDK_CONFIG_LIBDIR 00:42:29.848 #undef SPDK_CONFIG_LTO 00:42:29.848 #define SPDK_CONFIG_MAX_LCORES 00:42:29.848 #define SPDK_CONFIG_NVME_CUSE 1 00:42:29.848 #undef SPDK_CONFIG_OCF 00:42:29.848 #define SPDK_CONFIG_OCF_PATH 00:42:29.848 #define SPDK_CONFIG_OPENSSL_PATH 00:42:29.848 #undef SPDK_CONFIG_PGO_CAPTURE 00:42:29.848 #define SPDK_CONFIG_PGO_DIR 00:42:29.848 #undef SPDK_CONFIG_PGO_USE 00:42:29.848 #define SPDK_CONFIG_PREFIX /usr/local 00:42:29.848 #undef SPDK_CONFIG_RAID5F 00:42:29.848 #undef SPDK_CONFIG_RBD 00:42:29.848 #define SPDK_CONFIG_RDMA 1 00:42:29.848 #define SPDK_CONFIG_RDMA_PROV verbs 00:42:29.848 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:42:29.848 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:42:29.848 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:42:29.848 #define 
SPDK_CONFIG_SHARED 1 00:42:29.848 #undef SPDK_CONFIG_SMA 00:42:29.848 #define SPDK_CONFIG_TESTS 1 00:42:29.848 #undef SPDK_CONFIG_TSAN 00:42:29.848 #define SPDK_CONFIG_UBLK 1 00:42:29.848 #define SPDK_CONFIG_UBSAN 1 00:42:29.848 #undef SPDK_CONFIG_UNIT_TESTS 00:42:29.848 #undef SPDK_CONFIG_URING 00:42:29.848 #define SPDK_CONFIG_URING_PATH 00:42:29.848 #undef SPDK_CONFIG_URING_ZNS 00:42:29.848 #undef SPDK_CONFIG_USDT 00:42:29.848 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:42:29.848 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:42:29.848 #undef SPDK_CONFIG_VFIO_USER 00:42:29.848 #define SPDK_CONFIG_VFIO_USER_DIR 00:42:29.848 #define SPDK_CONFIG_VHOST 1 00:42:29.848 #define SPDK_CONFIG_VIRTIO 1 00:42:29.848 #undef SPDK_CONFIG_VTUNE 00:42:29.848 #define SPDK_CONFIG_VTUNE_DIR 00:42:29.848 #define SPDK_CONFIG_WERROR 1 00:42:29.848 #define SPDK_CONFIG_WPDK_DIR 00:42:29.848 #undef SPDK_CONFIG_XNVME 00:42:29.848 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:42:29.848 11:53:13 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:42:29.848 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:42:29.848 11:53:13 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:29.848 11:53:13 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:29.848 11:53:13 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:29.848 11:53:13 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:29.849 11:53:13 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:29.849 11:53:13 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:29.849 11:53:13 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:42:29.849 11:53:13 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:42:29.849 11:53:13 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:42:29.849 11:53:13 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:42:29.849 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:42:29.849 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:42:29.849 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:42:29.849 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:42:29.850 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:42:29.850 11:53:13 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:42:29.850 11:53:13 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:42:29.850 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:42:29.851 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 267482 ]] 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 267482 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.8Fs9f1 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.8Fs9f1/tests/interrupt /tmp/spdk.8Fs9f1 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:42:29.851 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=956399616 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4328030208 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=84617625600 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508552192 00:42:29.851 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9890926592 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249563648 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892316672 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901712896 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9396224 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253676032 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254278144 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=602112 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:42:29.852 * Looking for test storage... 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=84617625600 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:42:29.852 11:53:13 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=12105519104 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.852 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1681 -- # set -o errtrace 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # true 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@1688 -- # xtrace_fd 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:42:29.852 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:42:29.852 11:53:13 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=267647 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 267647 /var/tmp/spdk.sock 00:42:29.853 11:53:13 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:42:29.853 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@830 -- # '[' -z 267647 ']' 00:42:29.853 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:29.853 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:29.853 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:29.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:42:29.853 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:29.853 11:53:13 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:42:29.853 [2024-06-10 11:53:13.741016] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:42:29.853 [2024-06-10 11:53:13.741078] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid267647 ] 00:42:30.112 [2024-06-10 11:53:13.829903] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:42:30.112 [2024-06-10 11:53:13.922320] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:30.112 [2024-06-10 11:53:13.922404] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:30.112 [2024-06-10 11:53:13.922406] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:30.112 [2024-06-10 11:53:13.991271] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:42:30.680 11:53:14 reap_unregistered_poller -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:30.680 11:53:14 reap_unregistered_poller -- common/autotest_common.sh@863 -- # return 0 00:42:30.680 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:42:30.680 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:42:30.680 11:53:14 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:42:30.680 11:53:14 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:42:30.680 11:53:14 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:42:30.680 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:42:30.680 "name": "app_thread", 00:42:30.680 "id": 1, 00:42:30.680 "active_pollers": [], 00:42:30.680 "timed_pollers": [ 00:42:30.680 { 00:42:30.680 "name": "rpc_subsystem_poll_servers", 00:42:30.680 "id": 1, 00:42:30.680 "state": "waiting", 00:42:30.680 "run_count": 0, 00:42:30.680 "busy_count": 0, 00:42:30.680 "period_ticks": 9200000 00:42:30.680 } 00:42:30.680 ], 00:42:30.680 "paused_pollers": [] 00:42:30.680 }' 00:42:30.680 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:42:30.681 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:42:30.681 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:42:30.681 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:42:30.940 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:42:30.940 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:42:30.940 
11:53:14 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:42:30.940 11:53:14 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:42:30.940 11:53:14 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:42:30.940 5000+0 records in 00:42:30.940 5000+0 records out 00:42:30.940 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0253889 s, 403 MB/s 00:42:30.940 11:53:14 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:42:30.940 AIO0 00:42:30.940 11:53:14 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:31.199 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:42:31.459 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:42:31.459 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:42:31.459 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:42:31.459 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:42:31.459 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:42:31.459 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:42:31.459 "name": "app_thread", 00:42:31.459 "id": 1, 00:42:31.459 "active_pollers": [], 00:42:31.459 "timed_pollers": [ 00:42:31.459 { 00:42:31.459 "name": "rpc_subsystem_poll_servers", 00:42:31.459 "id": 1, 00:42:31.459 "state": "waiting", 00:42:31.459 "run_count": 0, 00:42:31.459 "busy_count": 0, 
00:42:31.459 "period_ticks": 9200000 00:42:31.459 } 00:42:31.459 ], 00:42:31.459 "paused_pollers": [] 00:42:31.459 }' 00:42:31.459 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:42:31.459 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:42:31.459 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:42:31.460 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:42:31.460 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:42:31.460 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:42:31.460 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:42:31.460 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 267647 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@949 -- # '[' -z 267647 ']' 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@953 -- # kill -0 267647 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@954 -- # uname 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 267647 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@967 -- # echo 
'killing process with pid 267647' 00:42:31.460 killing process with pid 267647 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@968 -- # kill 267647 00:42:31.460 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@973 -- # wait 267647 00:42:31.719 11:53:15 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:42:31.719 11:53:15 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:42:31.719 00:42:31.719 real 0m2.130s 00:42:31.719 user 0m1.172s 00:42:31.719 sys 0m0.649s 00:42:31.719 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:31.719 11:53:15 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:42:31.719 ************************************ 00:42:31.719 END TEST reap_unregistered_poller 00:42:31.719 ************************************ 00:42:31.719 11:53:15 -- spdk/autotest.sh@198 -- # uname -s 00:42:31.719 11:53:15 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:42:31.719 11:53:15 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:42:31.719 11:53:15 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:42:31.719 11:53:15 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@260 -- # timing_exit lib 00:42:31.719 11:53:15 -- common/autotest_common.sh@729 -- # xtrace_disable 00:42:31.719 11:53:15 -- common/autotest_common.sh@10 -- # set +x 00:42:31.719 11:53:15 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- 
spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:42:31.719 11:53:15 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:42:31.719 11:53:15 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:42:31.719 11:53:15 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:31.719 11:53:15 -- common/autotest_common.sh@10 -- # set +x 00:42:31.719 ************************************ 00:42:31.719 START TEST compress_compdev 00:42:31.719 ************************************ 00:42:31.719 11:53:15 compress_compdev -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:42:31.979 * Looking for test storage... 
00:42:31.979 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:804da62e-425e-e711-906e-0017a4403562 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=804da62e-425e-e711-906e-0017a4403562 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:42:31.979 11:53:15 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:31.979 11:53:15 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:31.979 11:53:15 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:31.979 11:53:15 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:31.979 11:53:15 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:31.979 11:53:15 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:31.979 11:53:15 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:42:31.979 11:53:15 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:42:31.979 11:53:15 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=267955 00:42:31.979 11:53:15 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 267955 00:42:31.979 11:53:15 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:42:31.979 11:53:15 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 267955 ']' 00:42:31.979 11:53:15 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:31.979 11:53:15 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:31.979 11:53:15 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:31.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:31.979 11:53:15 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:31.979 11:53:15 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:42:31.979 [2024-06-10 11:53:15.867156] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:31.979 [2024-06-10 11:53:15.867213] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid267955 ] 00:42:32.239 [2024-06-10 11:53:15.954552] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:42:32.239 [2024-06-10 11:53:16.040131] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:32.239 [2024-06-10 11:53:16.040134] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:32.806 [2024-06-10 11:53:16.580939] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:42:32.806 11:53:16 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:32.806 11:53:16 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:42:32.806 11:53:16 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:42:32.806 11:53:16 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:42:32.806 11:53:16 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:42:33.373 [2024-06-10 11:53:17.146132] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18b4db0 PMD being used: compress_qat 00:42:33.373 11:53:17 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:42:33.373 11:53:17 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:42:33.373 11:53:17 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:33.373 11:53:17 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:33.373 11:53:17 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:33.373 11:53:17 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:33.373 11:53:17 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:33.631 11:53:17 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:42:33.631 [ 00:42:33.631 { 00:42:33.631 "name": "Nvme0n1", 00:42:33.631 "aliases": [ 00:42:33.631 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:42:33.631 ], 00:42:33.631 "product_name": "NVMe disk", 00:42:33.631 "block_size": 512, 00:42:33.631 "num_blocks": 7501476528, 00:42:33.631 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:42:33.631 "assigned_rate_limits": { 00:42:33.631 "rw_ios_per_sec": 0, 00:42:33.631 "rw_mbytes_per_sec": 0, 00:42:33.631 "r_mbytes_per_sec": 0, 00:42:33.631 "w_mbytes_per_sec": 0 00:42:33.631 }, 00:42:33.631 "claimed": false, 00:42:33.631 "zoned": false, 00:42:33.631 "supported_io_types": { 00:42:33.631 "read": true, 00:42:33.631 "write": true, 00:42:33.631 "unmap": true, 00:42:33.631 "write_zeroes": true, 00:42:33.631 "flush": true, 00:42:33.631 "reset": true, 00:42:33.631 "compare": false, 00:42:33.631 "compare_and_write": false, 00:42:33.631 "abort": true, 00:42:33.631 "nvme_admin": true, 00:42:33.631 "nvme_io": true 00:42:33.631 }, 00:42:33.631 "driver_specific": { 00:42:33.631 "nvme": [ 00:42:33.631 { 00:42:33.631 "pci_address": "0000:5e:00.0", 00:42:33.631 "trid": { 00:42:33.631 "trtype": "PCIe", 00:42:33.631 "traddr": "0000:5e:00.0" 00:42:33.631 }, 00:42:33.631 "ctrlr_data": { 00:42:33.631 "cntlid": 0, 00:42:33.631 "vendor_id": "0x8086", 00:42:33.631 "model_number": "INTEL SSDPF2KX038T1", 00:42:33.631 "serial_number": "PHAX137100A93P8CGN", 00:42:33.631 "firmware_revision": "9CV10015", 00:42:33.631 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:42:33.631 "oacs": { 00:42:33.631 "security": 0, 00:42:33.631 "format": 1, 00:42:33.631 "firmware": 1, 00:42:33.631 "ns_manage": 1 00:42:33.631 }, 00:42:33.631 "multi_ctrlr": 
false, 00:42:33.631 "ana_reporting": false 00:42:33.631 }, 00:42:33.631 "vs": { 00:42:33.631 "nvme_version": "1.4" 00:42:33.631 }, 00:42:33.631 "ns_data": { 00:42:33.631 "id": 1, 00:42:33.631 "can_share": false 00:42:33.631 } 00:42:33.631 } 00:42:33.631 ], 00:42:33.631 "mp_policy": "active_passive" 00:42:33.631 } 00:42:33.631 } 00:42:33.631 ] 00:42:33.631 11:53:17 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:33.631 11:53:17 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:42:33.889 [2024-06-10 11:53:17.678309] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1702e70 PMD being used: compress_qat 00:42:33.889 3f50bb5a-b07d-4bbb-83ba-f8d1cff9bf3d 00:42:33.889 11:53:17 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:42:34.147 221a1a01-1690-497f-8875-dc976a6eed86 00:42:34.147 11:53:17 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:42:34.147 11:53:17 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:42:34.147 11:53:17 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:34.147 11:53:17 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:34.147 11:53:17 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:34.147 11:53:17 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:34.147 11:53:17 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:34.147 11:53:18 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:42:34.405 [ 00:42:34.405 { 00:42:34.405 "name": 
"221a1a01-1690-497f-8875-dc976a6eed86", 00:42:34.405 "aliases": [ 00:42:34.405 "lvs0/lv0" 00:42:34.405 ], 00:42:34.405 "product_name": "Logical Volume", 00:42:34.405 "block_size": 512, 00:42:34.405 "num_blocks": 204800, 00:42:34.405 "uuid": "221a1a01-1690-497f-8875-dc976a6eed86", 00:42:34.405 "assigned_rate_limits": { 00:42:34.405 "rw_ios_per_sec": 0, 00:42:34.405 "rw_mbytes_per_sec": 0, 00:42:34.405 "r_mbytes_per_sec": 0, 00:42:34.405 "w_mbytes_per_sec": 0 00:42:34.405 }, 00:42:34.405 "claimed": false, 00:42:34.405 "zoned": false, 00:42:34.405 "supported_io_types": { 00:42:34.405 "read": true, 00:42:34.405 "write": true, 00:42:34.405 "unmap": true, 00:42:34.405 "write_zeroes": true, 00:42:34.405 "flush": false, 00:42:34.405 "reset": true, 00:42:34.405 "compare": false, 00:42:34.405 "compare_and_write": false, 00:42:34.405 "abort": false, 00:42:34.405 "nvme_admin": false, 00:42:34.405 "nvme_io": false 00:42:34.405 }, 00:42:34.405 "driver_specific": { 00:42:34.405 "lvol": { 00:42:34.405 "lvol_store_uuid": "3f50bb5a-b07d-4bbb-83ba-f8d1cff9bf3d", 00:42:34.405 "base_bdev": "Nvme0n1", 00:42:34.405 "thin_provision": true, 00:42:34.405 "num_allocated_clusters": 0, 00:42:34.405 "snapshot": false, 00:42:34.405 "clone": false, 00:42:34.405 "esnap_clone": false 00:42:34.405 } 00:42:34.405 } 00:42:34.405 } 00:42:34.405 ] 00:42:34.405 11:53:18 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:34.405 11:53:18 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:42:34.405 11:53:18 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:42:34.664 [2024-06-10 11:53:18.369226] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:42:34.664 COMP_lvs0/lv0 00:42:34.664 11:53:18 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:42:34.664 11:53:18 compress_compdev 
-- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:42:34.664 11:53:18 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:34.664 11:53:18 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:34.664 11:53:18 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:34.664 11:53:18 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:34.664 11:53:18 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:34.664 11:53:18 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:42:34.921 [ 00:42:34.921 { 00:42:34.921 "name": "COMP_lvs0/lv0", 00:42:34.921 "aliases": [ 00:42:34.921 "012b7380-859d-5bd6-831e-2eec47e1761d" 00:42:34.921 ], 00:42:34.922 "product_name": "compress", 00:42:34.922 "block_size": 512, 00:42:34.922 "num_blocks": 200704, 00:42:34.922 "uuid": "012b7380-859d-5bd6-831e-2eec47e1761d", 00:42:34.922 "assigned_rate_limits": { 00:42:34.922 "rw_ios_per_sec": 0, 00:42:34.922 "rw_mbytes_per_sec": 0, 00:42:34.922 "r_mbytes_per_sec": 0, 00:42:34.922 "w_mbytes_per_sec": 0 00:42:34.922 }, 00:42:34.922 "claimed": false, 00:42:34.922 "zoned": false, 00:42:34.922 "supported_io_types": { 00:42:34.922 "read": true, 00:42:34.922 "write": true, 00:42:34.922 "unmap": false, 00:42:34.922 "write_zeroes": true, 00:42:34.922 "flush": false, 00:42:34.922 "reset": false, 00:42:34.922 "compare": false, 00:42:34.922 "compare_and_write": false, 00:42:34.922 "abort": false, 00:42:34.922 "nvme_admin": false, 00:42:34.922 "nvme_io": false 00:42:34.922 }, 00:42:34.922 "driver_specific": { 00:42:34.922 "compress": { 00:42:34.922 "name": "COMP_lvs0/lv0", 00:42:34.922 "base_bdev_name": "221a1a01-1690-497f-8875-dc976a6eed86" 00:42:34.922 } 00:42:34.922 } 00:42:34.922 } 00:42:34.922 ] 
00:42:34.922 11:53:18 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:34.922 11:53:18 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:42:34.922 [2024-06-10 11:53:18.803018] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9ce01b15c0 PMD being used: compress_qat 00:42:34.922 [2024-06-10 11:53:18.804679] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x178de20 PMD being used: compress_qat 00:42:34.922 Running I/O for 3 seconds... 00:42:38.203 00:42:38.203 Latency(us) 00:42:38.203 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:38.204 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:42:38.204 Verification LBA range: start 0x0 length 0x3100 00:42:38.204 COMP_lvs0/lv0 : 3.00 5399.12 21.09 0.00 0.00 5893.66 463.03 6183.18 00:42:38.204 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:42:38.204 Verification LBA range: start 0x3100 length 0x3100 00:42:38.204 COMP_lvs0/lv0 : 3.00 5684.02 22.20 0.00 0.00 5600.19 379.33 6154.69 00:42:38.204 =================================================================================================================== 00:42:38.204 Total : 11083.14 43.29 0.00 0.00 5743.15 379.33 6183.18 00:42:38.204 0 00:42:38.204 11:53:21 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:42:38.204 11:53:21 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:42:38.204 11:53:22 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:42:38.461 11:53:22 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:42:38.461 11:53:22 compress_compdev -- compress/compress.sh@78 -- # killprocess 267955 
00:42:38.461 11:53:22 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 267955 ']' 00:42:38.461 11:53:22 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 267955 00:42:38.461 11:53:22 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 267955 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 267955' 00:42:38.462 killing process with pid 267955 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@968 -- # kill 267955 00:42:38.462 Received shutdown signal, test time was about 3.000000 seconds 00:42:38.462 00:42:38.462 Latency(us) 00:42:38.462 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:38.462 =================================================================================================================== 00:42:38.462 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:42:38.462 11:53:22 compress_compdev -- common/autotest_common.sh@973 -- # wait 267955 00:42:40.362 11:53:23 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:42:40.362 11:53:23 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:42:40.362 11:53:23 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:42:40.362 11:53:23 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=269036 00:42:40.362 11:53:23 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:40.362 11:53:23 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 269036 00:42:40.362 11:53:23 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 269036 ']' 00:42:40.362 11:53:23 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:40.362 11:53:23 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:40.362 11:53:23 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:40.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:40.362 11:53:23 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:40.362 11:53:23 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:42:40.362 [2024-06-10 11:53:23.842327] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:40.362 [2024-06-10 11:53:23.842379] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid269036 ] 00:42:40.362 [2024-06-10 11:53:23.927421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:42:40.362 [2024-06-10 11:53:24.011910] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:40.362 [2024-06-10 11:53:24.011912] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:40.620 [2024-06-10 11:53:24.551148] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:42:40.878 11:53:24 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:40.878 11:53:24 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:42:40.878 11:53:24 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:42:40.878 11:53:24 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:42:40.878 11:53:24 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:42:41.444 [2024-06-10 11:53:25.145830] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b24db0 PMD being used: compress_qat 00:42:41.444 11:53:25 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:42:41.444 11:53:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:42:41.444 11:53:25 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:41.444 11:53:25 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:41.444 11:53:25 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:41.444 11:53:25 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:41.444 11:53:25 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:41.444 11:53:25 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:42:41.701 [ 00:42:41.701 { 00:42:41.701 "name": "Nvme0n1", 00:42:41.701 "aliases": [ 00:42:41.701 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:42:41.701 ], 00:42:41.701 "product_name": "NVMe disk", 00:42:41.701 "block_size": 512, 00:42:41.701 "num_blocks": 7501476528, 00:42:41.701 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:42:41.701 "assigned_rate_limits": { 00:42:41.701 "rw_ios_per_sec": 0, 00:42:41.701 "rw_mbytes_per_sec": 0, 00:42:41.701 "r_mbytes_per_sec": 0, 00:42:41.701 "w_mbytes_per_sec": 0 00:42:41.701 }, 00:42:41.701 "claimed": false, 00:42:41.701 "zoned": false, 00:42:41.701 "supported_io_types": { 00:42:41.701 "read": true, 00:42:41.701 "write": true, 00:42:41.701 "unmap": true, 00:42:41.701 "write_zeroes": true, 00:42:41.701 "flush": true, 00:42:41.701 "reset": true, 00:42:41.701 "compare": false, 00:42:41.701 "compare_and_write": false, 00:42:41.701 "abort": true, 00:42:41.701 "nvme_admin": true, 00:42:41.701 "nvme_io": true 00:42:41.701 }, 00:42:41.701 "driver_specific": { 00:42:41.701 "nvme": [ 00:42:41.701 { 00:42:41.701 "pci_address": "0000:5e:00.0", 00:42:41.701 "trid": { 00:42:41.701 "trtype": "PCIe", 00:42:41.701 "traddr": "0000:5e:00.0" 00:42:41.702 }, 00:42:41.702 "ctrlr_data": { 00:42:41.702 "cntlid": 0, 00:42:41.702 "vendor_id": "0x8086", 00:42:41.702 "model_number": "INTEL SSDPF2KX038T1", 00:42:41.702 "serial_number": "PHAX137100A93P8CGN", 00:42:41.702 "firmware_revision": "9CV10015", 00:42:41.702 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:42:41.702 "oacs": { 00:42:41.702 "security": 0, 00:42:41.702 "format": 1, 00:42:41.702 "firmware": 1, 00:42:41.702 "ns_manage": 1 00:42:41.702 }, 00:42:41.702 "multi_ctrlr": 
false, 00:42:41.702 "ana_reporting": false 00:42:41.702 }, 00:42:41.702 "vs": { 00:42:41.702 "nvme_version": "1.4" 00:42:41.702 }, 00:42:41.702 "ns_data": { 00:42:41.702 "id": 1, 00:42:41.702 "can_share": false 00:42:41.702 } 00:42:41.702 } 00:42:41.702 ], 00:42:41.702 "mp_policy": "active_passive" 00:42:41.702 } 00:42:41.702 } 00:42:41.702 ] 00:42:41.702 11:53:25 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:41.702 11:53:25 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:42:41.960 [2024-06-10 11:53:25.669756] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1973150 PMD being used: compress_qat 00:42:41.960 cb4e2ef2-cf08-4741-aaa4-bae69eff28b8 00:42:41.960 11:53:25 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:42:41.960 c2dba513-d29f-4bef-adcd-1f889d86f76c 00:42:41.960 11:53:25 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:42:41.960 11:53:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:42:41.960 11:53:25 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:41.960 11:53:25 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:41.960 11:53:25 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:41.960 11:53:25 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:41.960 11:53:25 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:42.218 11:53:26 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:42:42.477 [ 00:42:42.477 { 00:42:42.477 "name": 
"c2dba513-d29f-4bef-adcd-1f889d86f76c", 00:42:42.477 "aliases": [ 00:42:42.477 "lvs0/lv0" 00:42:42.477 ], 00:42:42.477 "product_name": "Logical Volume", 00:42:42.477 "block_size": 512, 00:42:42.477 "num_blocks": 204800, 00:42:42.477 "uuid": "c2dba513-d29f-4bef-adcd-1f889d86f76c", 00:42:42.477 "assigned_rate_limits": { 00:42:42.477 "rw_ios_per_sec": 0, 00:42:42.477 "rw_mbytes_per_sec": 0, 00:42:42.477 "r_mbytes_per_sec": 0, 00:42:42.477 "w_mbytes_per_sec": 0 00:42:42.477 }, 00:42:42.477 "claimed": false, 00:42:42.477 "zoned": false, 00:42:42.477 "supported_io_types": { 00:42:42.477 "read": true, 00:42:42.477 "write": true, 00:42:42.477 "unmap": true, 00:42:42.477 "write_zeroes": true, 00:42:42.477 "flush": false, 00:42:42.477 "reset": true, 00:42:42.477 "compare": false, 00:42:42.477 "compare_and_write": false, 00:42:42.477 "abort": false, 00:42:42.477 "nvme_admin": false, 00:42:42.477 "nvme_io": false 00:42:42.477 }, 00:42:42.477 "driver_specific": { 00:42:42.477 "lvol": { 00:42:42.477 "lvol_store_uuid": "cb4e2ef2-cf08-4741-aaa4-bae69eff28b8", 00:42:42.477 "base_bdev": "Nvme0n1", 00:42:42.477 "thin_provision": true, 00:42:42.477 "num_allocated_clusters": 0, 00:42:42.477 "snapshot": false, 00:42:42.477 "clone": false, 00:42:42.477 "esnap_clone": false 00:42:42.477 } 00:42:42.477 } 00:42:42.477 } 00:42:42.477 ] 00:42:42.477 11:53:26 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:42.477 11:53:26 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:42:42.477 11:53:26 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:42:42.477 [2024-06-10 11:53:26.380949] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:42:42.477 COMP_lvs0/lv0 00:42:42.477 11:53:26 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:42:42.477 11:53:26 
compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:42:42.477 11:53:26 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:42.477 11:53:26 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:42.477 11:53:26 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:42.477 11:53:26 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:42.477 11:53:26 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:42.736 11:53:26 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:42:42.994 [ 00:42:42.994 { 00:42:42.994 "name": "COMP_lvs0/lv0", 00:42:42.994 "aliases": [ 00:42:42.994 "51236a6b-d497-5a04-81fe-da0b967309a7" 00:42:42.994 ], 00:42:42.994 "product_name": "compress", 00:42:42.994 "block_size": 512, 00:42:42.994 "num_blocks": 200704, 00:42:42.994 "uuid": "51236a6b-d497-5a04-81fe-da0b967309a7", 00:42:42.994 "assigned_rate_limits": { 00:42:42.994 "rw_ios_per_sec": 0, 00:42:42.994 "rw_mbytes_per_sec": 0, 00:42:42.994 "r_mbytes_per_sec": 0, 00:42:42.994 "w_mbytes_per_sec": 0 00:42:42.994 }, 00:42:42.994 "claimed": false, 00:42:42.994 "zoned": false, 00:42:42.994 "supported_io_types": { 00:42:42.994 "read": true, 00:42:42.994 "write": true, 00:42:42.994 "unmap": false, 00:42:42.994 "write_zeroes": true, 00:42:42.994 "flush": false, 00:42:42.994 "reset": false, 00:42:42.994 "compare": false, 00:42:42.994 "compare_and_write": false, 00:42:42.994 "abort": false, 00:42:42.994 "nvme_admin": false, 00:42:42.994 "nvme_io": false 00:42:42.994 }, 00:42:42.994 "driver_specific": { 00:42:42.994 "compress": { 00:42:42.994 "name": "COMP_lvs0/lv0", 00:42:42.994 "base_bdev_name": "c2dba513-d29f-4bef-adcd-1f889d86f76c" 00:42:42.994 } 00:42:42.994 } 00:42:42.994 } 
00:42:42.994 ] 00:42:42.994 11:53:26 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:42.994 11:53:26 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:42:42.994 [2024-06-10 11:53:26.814728] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f42e01b15c0 PMD being used: compress_qat 00:42:42.994 [2024-06-10 11:53:26.816438] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a04c80 PMD being used: compress_qat 00:42:42.994 Running I/O for 3 seconds... 00:42:46.278 00:42:46.278 Latency(us) 00:42:46.278 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:46.278 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:42:46.278 Verification LBA range: start 0x0 length 0x3100 00:42:46.278 COMP_lvs0/lv0 : 3.00 5402.15 21.10 0.00 0.00 5890.89 413.16 5385.35 00:42:46.278 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:42:46.278 Verification LBA range: start 0x3100 length 0x3100 00:42:46.278 COMP_lvs0/lv0 : 3.00 5689.07 22.22 0.00 0.00 5595.81 256.45 5299.87 00:42:46.278 =================================================================================================================== 00:42:46.278 Total : 11091.22 43.33 0.00 0.00 5739.53 256.45 5385.35 00:42:46.278 0 00:42:46.278 11:53:29 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:42:46.278 11:53:29 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:42:46.278 11:53:30 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:42:46.278 11:53:30 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:42:46.278 11:53:30 compress_compdev -- compress/compress.sh@78 -- # 
killprocess 269036 00:42:46.278 11:53:30 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 269036 ']' 00:42:46.278 11:53:30 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 269036 00:42:46.278 11:53:30 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 269036 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 269036' 00:42:46.536 killing process with pid 269036 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@968 -- # kill 269036 00:42:46.536 Received shutdown signal, test time was about 3.000000 seconds 00:42:46.536 00:42:46.536 Latency(us) 00:42:46.536 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:46.536 =================================================================================================================== 00:42:46.536 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:42:46.536 11:53:30 compress_compdev -- common/autotest_common.sh@973 -- # wait 269036 00:42:48.437 11:53:31 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:42:48.437 11:53:31 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:42:48.437 11:53:31 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=270113 00:42:48.437 11:53:31 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:48.437 11:53:31 compress_compdev -- compress/compress.sh@67 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:42:48.437 11:53:31 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 270113 00:42:48.437 11:53:31 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 270113 ']' 00:42:48.437 11:53:31 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:48.437 11:53:31 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:48.437 11:53:31 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:48.437 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:48.437 11:53:31 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:48.437 11:53:31 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:42:48.437 [2024-06-10 11:53:31.964971] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:48.437 [2024-06-10 11:53:31.965027] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid270113 ] 00:42:48.437 [2024-06-10 11:53:32.051911] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:42:48.437 [2024-06-10 11:53:32.133913] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:48.437 [2024-06-10 11:53:32.133916] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:49.002 [2024-06-10 11:53:32.670494] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:42:49.002 11:53:32 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:49.002 11:53:32 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:42:49.002 11:53:32 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:42:49.003 11:53:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:42:49.003 11:53:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:42:49.571 [2024-06-10 11:53:33.260115] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11b9db0 PMD being used: compress_qat 00:42:49.571 11:53:33 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:42:49.571 11:53:33 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:42:49.571 11:53:33 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:49.571 11:53:33 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:49.571 11:53:33 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:49.571 11:53:33 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:49.571 11:53:33 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:49.571 11:53:33 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:42:49.894 [ 00:42:49.894 { 00:42:49.894 "name": "Nvme0n1", 00:42:49.894 "aliases": [ 00:42:49.894 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:42:49.894 ], 00:42:49.894 "product_name": "NVMe disk", 00:42:49.894 "block_size": 512, 00:42:49.895 "num_blocks": 7501476528, 00:42:49.895 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:42:49.895 "assigned_rate_limits": { 00:42:49.895 "rw_ios_per_sec": 0, 00:42:49.895 "rw_mbytes_per_sec": 0, 00:42:49.895 "r_mbytes_per_sec": 0, 00:42:49.895 "w_mbytes_per_sec": 0 00:42:49.895 }, 00:42:49.895 "claimed": false, 00:42:49.895 "zoned": false, 00:42:49.895 "supported_io_types": { 00:42:49.895 "read": true, 00:42:49.895 "write": true, 00:42:49.895 "unmap": true, 00:42:49.895 "write_zeroes": true, 00:42:49.895 "flush": true, 00:42:49.895 "reset": true, 00:42:49.895 "compare": false, 00:42:49.895 "compare_and_write": false, 00:42:49.895 "abort": true, 00:42:49.895 "nvme_admin": true, 00:42:49.895 "nvme_io": true 00:42:49.895 }, 00:42:49.895 "driver_specific": { 00:42:49.895 "nvme": [ 00:42:49.895 { 00:42:49.895 "pci_address": "0000:5e:00.0", 00:42:49.895 "trid": { 00:42:49.895 "trtype": "PCIe", 00:42:49.895 "traddr": "0000:5e:00.0" 00:42:49.895 }, 00:42:49.895 "ctrlr_data": { 00:42:49.895 "cntlid": 0, 00:42:49.895 "vendor_id": "0x8086", 00:42:49.895 "model_number": "INTEL SSDPF2KX038T1", 00:42:49.895 "serial_number": "PHAX137100A93P8CGN", 00:42:49.895 "firmware_revision": "9CV10015", 00:42:49.895 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:42:49.895 "oacs": { 00:42:49.895 "security": 0, 00:42:49.895 "format": 1, 00:42:49.895 "firmware": 1, 00:42:49.895 "ns_manage": 1 00:42:49.895 }, 00:42:49.895 "multi_ctrlr": 
false, 00:42:49.895 "ana_reporting": false 00:42:49.895 }, 00:42:49.895 "vs": { 00:42:49.895 "nvme_version": "1.4" 00:42:49.895 }, 00:42:49.895 "ns_data": { 00:42:49.895 "id": 1, 00:42:49.895 "can_share": false 00:42:49.895 } 00:42:49.895 } 00:42:49.895 ], 00:42:49.895 "mp_policy": "active_passive" 00:42:49.895 } 00:42:49.895 } 00:42:49.895 ] 00:42:49.895 11:53:33 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:49.895 11:53:33 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:42:49.895 [2024-06-10 11:53:33.796292] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1008150 PMD being used: compress_qat 00:42:49.895 b5f4eed9-c181-411d-951d-03a334466d42 00:42:49.895 11:53:33 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:42:50.153 52171982-253a-4af3-aff8-7b3d17cac9fb 00:42:50.153 11:53:33 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:42:50.153 11:53:33 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:42:50.153 11:53:33 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:50.153 11:53:33 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:50.153 11:53:33 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:50.153 11:53:33 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:50.153 11:53:33 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:50.411 11:53:34 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:42:50.411 [ 00:42:50.411 { 00:42:50.411 "name": 
"52171982-253a-4af3-aff8-7b3d17cac9fb", 00:42:50.411 "aliases": [ 00:42:50.411 "lvs0/lv0" 00:42:50.411 ], 00:42:50.411 "product_name": "Logical Volume", 00:42:50.411 "block_size": 512, 00:42:50.411 "num_blocks": 204800, 00:42:50.411 "uuid": "52171982-253a-4af3-aff8-7b3d17cac9fb", 00:42:50.411 "assigned_rate_limits": { 00:42:50.411 "rw_ios_per_sec": 0, 00:42:50.411 "rw_mbytes_per_sec": 0, 00:42:50.411 "r_mbytes_per_sec": 0, 00:42:50.411 "w_mbytes_per_sec": 0 00:42:50.411 }, 00:42:50.411 "claimed": false, 00:42:50.411 "zoned": false, 00:42:50.411 "supported_io_types": { 00:42:50.411 "read": true, 00:42:50.411 "write": true, 00:42:50.411 "unmap": true, 00:42:50.411 "write_zeroes": true, 00:42:50.411 "flush": false, 00:42:50.411 "reset": true, 00:42:50.411 "compare": false, 00:42:50.411 "compare_and_write": false, 00:42:50.411 "abort": false, 00:42:50.411 "nvme_admin": false, 00:42:50.411 "nvme_io": false 00:42:50.411 }, 00:42:50.411 "driver_specific": { 00:42:50.411 "lvol": { 00:42:50.411 "lvol_store_uuid": "b5f4eed9-c181-411d-951d-03a334466d42", 00:42:50.411 "base_bdev": "Nvme0n1", 00:42:50.411 "thin_provision": true, 00:42:50.411 "num_allocated_clusters": 0, 00:42:50.411 "snapshot": false, 00:42:50.411 "clone": false, 00:42:50.411 "esnap_clone": false 00:42:50.411 } 00:42:50.411 } 00:42:50.411 } 00:42:50.411 ] 00:42:50.411 11:53:34 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:50.411 11:53:34 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:42:50.411 11:53:34 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:42:50.670 [2024-06-10 11:53:34.495330] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:42:50.670 COMP_lvs0/lv0 00:42:50.670 11:53:34 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:42:50.670 11:53:34 
compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:42:50.670 11:53:34 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:50.670 11:53:34 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:50.670 11:53:34 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:50.670 11:53:34 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:50.670 11:53:34 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:50.928 11:53:34 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:42:50.928 [ 00:42:50.928 { 00:42:50.928 "name": "COMP_lvs0/lv0", 00:42:50.928 "aliases": [ 00:42:50.928 "bbe3b12d-8a5a-52b8-8d91-9a1bc1e7f286" 00:42:50.928 ], 00:42:50.928 "product_name": "compress", 00:42:50.928 "block_size": 4096, 00:42:50.928 "num_blocks": 25088, 00:42:50.928 "uuid": "bbe3b12d-8a5a-52b8-8d91-9a1bc1e7f286", 00:42:50.928 "assigned_rate_limits": { 00:42:50.928 "rw_ios_per_sec": 0, 00:42:50.928 "rw_mbytes_per_sec": 0, 00:42:50.928 "r_mbytes_per_sec": 0, 00:42:50.928 "w_mbytes_per_sec": 0 00:42:50.928 }, 00:42:50.928 "claimed": false, 00:42:50.928 "zoned": false, 00:42:50.928 "supported_io_types": { 00:42:50.928 "read": true, 00:42:50.928 "write": true, 00:42:50.928 "unmap": false, 00:42:50.928 "write_zeroes": true, 00:42:50.928 "flush": false, 00:42:50.928 "reset": false, 00:42:50.928 "compare": false, 00:42:50.928 "compare_and_write": false, 00:42:50.929 "abort": false, 00:42:50.929 "nvme_admin": false, 00:42:50.929 "nvme_io": false 00:42:50.929 }, 00:42:50.929 "driver_specific": { 00:42:50.929 "compress": { 00:42:50.929 "name": "COMP_lvs0/lv0", 00:42:50.929 "base_bdev_name": "52171982-253a-4af3-aff8-7b3d17cac9fb" 00:42:50.929 } 00:42:50.929 } 00:42:50.929 } 
00:42:50.929 ] 00:42:50.929 11:53:34 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:50.929 11:53:34 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:42:51.187 [2024-06-10 11:53:34.933077] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb5001b15c0 PMD being used: compress_qat 00:42:51.187 [2024-06-10 11:53:34.934745] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1099c80 PMD being used: compress_qat 00:42:51.187 Running I/O for 3 seconds... 00:42:54.470 00:42:54.470 Latency(us) 00:42:54.470 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:54.470 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:42:54.470 Verification LBA range: start 0x0 length 0x3100 00:42:54.470 COMP_lvs0/lv0 : 3.00 5375.24 21.00 0.00 0.00 5918.55 375.76 5470.83 00:42:54.470 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:42:54.470 Verification LBA range: start 0x3100 length 0x3100 00:42:54.470 COMP_lvs0/lv0 : 3.00 5628.16 21.99 0.00 0.00 5656.28 322.34 5214.39 00:42:54.470 =================================================================================================================== 00:42:54.470 Total : 11003.41 42.98 0.00 0.00 5784.41 322.34 5470.83 00:42:54.470 0 00:42:54.470 11:53:37 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:42:54.470 11:53:37 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:42:54.470 11:53:38 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:42:54.470 11:53:38 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:42:54.470 11:53:38 compress_compdev -- compress/compress.sh@78 -- # 
killprocess 270113 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 270113 ']' 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 270113 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 270113 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 270113' 00:42:54.470 killing process with pid 270113 00:42:54.470 11:53:38 compress_compdev -- common/autotest_common.sh@968 -- # kill 270113 00:42:54.470 Received shutdown signal, test time was about 3.000000 seconds 00:42:54.470 00:42:54.470 Latency(us) 00:42:54.470 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:54.470 =================================================================================================================== 00:42:54.470 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:42:54.471 11:53:38 compress_compdev -- common/autotest_common.sh@973 -- # wait 270113 00:42:56.372 11:53:39 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:42:56.372 11:53:39 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:42:56.373 11:53:39 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=271201 00:42:56.373 11:53:39 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:42:56.373 11:53:39 compress_compdev -- compress/compress.sh@56 -- # trap 
'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:42:56.373 11:53:39 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 271201 00:42:56.373 11:53:39 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 271201 ']' 00:42:56.373 11:53:39 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:56.373 11:53:39 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:42:56.373 11:53:39 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:56.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:56.373 11:53:39 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:42:56.373 11:53:39 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:42:56.373 [2024-06-10 11:53:40.030534] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:42:56.373 [2024-06-10 11:53:40.030609] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid271201 ] 00:42:56.373 [2024-06-10 11:53:40.127161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:42:56.373 [2024-06-10 11:53:40.212174] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:42:56.373 [2024-06-10 11:53:40.212261] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:42:56.373 [2024-06-10 11:53:40.212263] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:56.939 [2024-06-10 11:53:40.749066] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:42:56.939 11:53:40 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:42:56.939 11:53:40 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:42:56.939 11:53:40 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:42:56.939 11:53:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:42:56.939 11:53:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:42:57.507 [2024-06-10 11:53:41.346129] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x181d7f0 PMD being used: compress_qat 00:42:57.507 11:53:41 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:42:57.507 11:53:41 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:42:57.507 11:53:41 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:57.507 11:53:41 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:57.507 11:53:41 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:57.507 
11:53:41 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:57.507 11:53:41 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:57.766 11:53:41 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:42:57.766 [ 00:42:57.766 { 00:42:57.766 "name": "Nvme0n1", 00:42:57.766 "aliases": [ 00:42:57.766 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:42:57.766 ], 00:42:57.766 "product_name": "NVMe disk", 00:42:57.766 "block_size": 512, 00:42:57.766 "num_blocks": 7501476528, 00:42:57.766 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:42:57.766 "assigned_rate_limits": { 00:42:57.766 "rw_ios_per_sec": 0, 00:42:57.766 "rw_mbytes_per_sec": 0, 00:42:57.766 "r_mbytes_per_sec": 0, 00:42:57.766 "w_mbytes_per_sec": 0 00:42:57.766 }, 00:42:57.766 "claimed": false, 00:42:57.766 "zoned": false, 00:42:57.766 "supported_io_types": { 00:42:57.766 "read": true, 00:42:57.766 "write": true, 00:42:57.766 "unmap": true, 00:42:57.766 "write_zeroes": true, 00:42:57.766 "flush": true, 00:42:57.766 "reset": true, 00:42:57.766 "compare": false, 00:42:57.766 "compare_and_write": false, 00:42:57.766 "abort": true, 00:42:57.766 "nvme_admin": true, 00:42:57.766 "nvme_io": true 00:42:57.766 }, 00:42:57.766 "driver_specific": { 00:42:57.766 "nvme": [ 00:42:57.766 { 00:42:57.766 "pci_address": "0000:5e:00.0", 00:42:57.766 "trid": { 00:42:57.766 "trtype": "PCIe", 00:42:57.766 "traddr": "0000:5e:00.0" 00:42:57.766 }, 00:42:57.766 "ctrlr_data": { 00:42:57.766 "cntlid": 0, 00:42:57.766 "vendor_id": "0x8086", 00:42:57.766 "model_number": "INTEL SSDPF2KX038T1", 00:42:57.766 "serial_number": "PHAX137100A93P8CGN", 00:42:57.766 "firmware_revision": "9CV10015", 00:42:57.766 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:42:57.766 "oacs": { 00:42:57.766 "security": 0, 00:42:57.766 
"format": 1, 00:42:57.766 "firmware": 1, 00:42:57.766 "ns_manage": 1 00:42:57.766 }, 00:42:57.766 "multi_ctrlr": false, 00:42:57.766 "ana_reporting": false 00:42:57.766 }, 00:42:57.766 "vs": { 00:42:57.766 "nvme_version": "1.4" 00:42:57.766 }, 00:42:57.766 "ns_data": { 00:42:57.766 "id": 1, 00:42:57.766 "can_share": false 00:42:57.766 } 00:42:57.766 } 00:42:57.766 ], 00:42:57.766 "mp_policy": "active_passive" 00:42:57.766 } 00:42:57.766 } 00:42:57.766 ] 00:42:58.025 11:53:41 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:58.025 11:53:41 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:42:58.025 [2024-06-10 11:53:41.866272] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x166bb90 PMD being used: compress_qat 00:42:58.025 b302ce1e-0fc1-469f-9fb2-6212af646b11 00:42:58.025 11:53:41 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:42:58.284 188a30c6-90a0-4efb-96fd-661b9cacf8be 00:42:58.284 11:53:42 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:58.284 11:53:42 compress_compdev -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:42:58.542 [ 00:42:58.542 { 00:42:58.542 "name": "188a30c6-90a0-4efb-96fd-661b9cacf8be", 00:42:58.542 "aliases": [ 00:42:58.542 "lvs0/lv0" 00:42:58.542 ], 00:42:58.542 "product_name": "Logical Volume", 00:42:58.542 "block_size": 512, 00:42:58.542 "num_blocks": 204800, 00:42:58.542 "uuid": "188a30c6-90a0-4efb-96fd-661b9cacf8be", 00:42:58.542 "assigned_rate_limits": { 00:42:58.542 "rw_ios_per_sec": 0, 00:42:58.542 "rw_mbytes_per_sec": 0, 00:42:58.542 "r_mbytes_per_sec": 0, 00:42:58.542 "w_mbytes_per_sec": 0 00:42:58.542 }, 00:42:58.542 "claimed": false, 00:42:58.542 "zoned": false, 00:42:58.542 "supported_io_types": { 00:42:58.542 "read": true, 00:42:58.542 "write": true, 00:42:58.542 "unmap": true, 00:42:58.542 "write_zeroes": true, 00:42:58.542 "flush": false, 00:42:58.542 "reset": true, 00:42:58.542 "compare": false, 00:42:58.542 "compare_and_write": false, 00:42:58.542 "abort": false, 00:42:58.542 "nvme_admin": false, 00:42:58.542 "nvme_io": false 00:42:58.542 }, 00:42:58.542 "driver_specific": { 00:42:58.542 "lvol": { 00:42:58.542 "lvol_store_uuid": "b302ce1e-0fc1-469f-9fb2-6212af646b11", 00:42:58.542 "base_bdev": "Nvme0n1", 00:42:58.542 "thin_provision": true, 00:42:58.542 "num_allocated_clusters": 0, 00:42:58.542 "snapshot": false, 00:42:58.542 "clone": false, 00:42:58.542 "esnap_clone": false 00:42:58.542 } 00:42:58.542 } 00:42:58.542 } 00:42:58.542 ] 00:42:58.542 11:53:42 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:58.542 11:53:42 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:42:58.542 11:53:42 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:42:58.801 [2024-06-10 11:53:42.573456] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:42:58.801 
COMP_lvs0/lv0 00:42:58.801 11:53:42 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:42:58.801 11:53:42 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:42:58.801 11:53:42 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:42:58.801 11:53:42 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:42:58.801 11:53:42 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:42:58.801 11:53:42 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:42:58.801 11:53:42 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:42:59.060 11:53:42 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:42:59.060 [ 00:42:59.060 { 00:42:59.060 "name": "COMP_lvs0/lv0", 00:42:59.060 "aliases": [ 00:42:59.060 "d644f1fb-830e-569b-871f-9d036e5cc072" 00:42:59.060 ], 00:42:59.060 "product_name": "compress", 00:42:59.060 "block_size": 512, 00:42:59.060 "num_blocks": 200704, 00:42:59.060 "uuid": "d644f1fb-830e-569b-871f-9d036e5cc072", 00:42:59.060 "assigned_rate_limits": { 00:42:59.060 "rw_ios_per_sec": 0, 00:42:59.060 "rw_mbytes_per_sec": 0, 00:42:59.060 "r_mbytes_per_sec": 0, 00:42:59.060 "w_mbytes_per_sec": 0 00:42:59.060 }, 00:42:59.060 "claimed": false, 00:42:59.060 "zoned": false, 00:42:59.060 "supported_io_types": { 00:42:59.060 "read": true, 00:42:59.060 "write": true, 00:42:59.060 "unmap": false, 00:42:59.060 "write_zeroes": true, 00:42:59.060 "flush": false, 00:42:59.060 "reset": false, 00:42:59.060 "compare": false, 00:42:59.060 "compare_and_write": false, 00:42:59.060 "abort": false, 00:42:59.060 "nvme_admin": false, 00:42:59.060 "nvme_io": false 00:42:59.060 }, 00:42:59.060 "driver_specific": { 00:42:59.060 "compress": { 00:42:59.060 "name": 
"COMP_lvs0/lv0", 00:42:59.060 "base_bdev_name": "188a30c6-90a0-4efb-96fd-661b9cacf8be" 00:42:59.060 } 00:42:59.060 } 00:42:59.060 } 00:42:59.060 ] 00:42:59.060 11:53:42 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:42:59.060 11:53:42 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:42:59.319 [2024-06-10 11:53:43.014321] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb1701b1350 PMD being used: compress_qat 00:42:59.319 I/O targets: 00:42:59.319 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:42:59.319 00:42:59.319 00:42:59.319 CUnit - A unit testing framework for C - Version 2.1-3 00:42:59.319 http://cunit.sourceforge.net/ 00:42:59.319 00:42:59.319 00:42:59.319 Suite: bdevio tests on: COMP_lvs0/lv0 00:42:59.319 Test: blockdev write read block ...passed 00:42:59.319 Test: blockdev write zeroes read block ...passed 00:42:59.319 Test: blockdev write zeroes read no split ...passed 00:42:59.319 Test: blockdev write zeroes read split ...passed 00:42:59.319 Test: blockdev write zeroes read split partial ...passed 00:42:59.319 Test: blockdev reset ...[2024-06-10 11:53:43.051563] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:42:59.319 passed 00:42:59.319 Test: blockdev write read 8 blocks ...passed 00:42:59.319 Test: blockdev write read size > 128k ...passed 00:42:59.319 Test: blockdev write read invalid size ...passed 00:42:59.319 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:42:59.319 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:42:59.319 Test: blockdev write read max offset ...passed 00:42:59.319 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:42:59.319 Test: blockdev writev readv 8 blocks ...passed 00:42:59.319 Test: blockdev writev readv 30 x 1block ...passed 00:42:59.319 Test: blockdev writev readv block ...passed 
00:42:59.319 Test: blockdev writev readv size > 128k ...passed 00:42:59.319 Test: blockdev writev readv size > 128k in two iovs ...passed 00:42:59.320 Test: blockdev comparev and writev ...passed 00:42:59.320 Test: blockdev nvme passthru rw ...passed 00:42:59.320 Test: blockdev nvme passthru vendor specific ...passed 00:42:59.320 Test: blockdev nvme admin passthru ...passed 00:42:59.320 Test: blockdev copy ...passed 00:42:59.320 00:42:59.320 Run Summary: Type Total Ran Passed Failed Inactive 00:42:59.320 suites 1 1 n/a 0 0 00:42:59.320 tests 23 23 23 0 0 00:42:59.320 asserts 130 130 130 0 n/a 00:42:59.320 00:42:59.320 Elapsed time = 0.092 seconds 00:42:59.320 0 00:42:59.320 11:53:43 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:42:59.320 11:53:43 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:42:59.320 11:53:43 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:42:59.578 11:53:43 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:42:59.578 11:53:43 compress_compdev -- compress/compress.sh@62 -- # killprocess 271201 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 271201 ']' 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 271201 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 271201 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:42:59.578 11:53:43 compress_compdev -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 271201' 00:42:59.578 killing process with pid 271201 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@968 -- # kill 271201 00:42:59.578 11:53:43 compress_compdev -- common/autotest_common.sh@973 -- # wait 271201 00:43:01.479 11:53:45 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:43:01.479 11:53:45 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:43:01.479 00:43:01.479 real 0m29.418s 00:43:01.479 user 1m6.817s 00:43:01.479 sys 0m4.604s 00:43:01.479 11:53:45 compress_compdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:43:01.479 11:53:45 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:43:01.479 ************************************ 00:43:01.479 END TEST compress_compdev 00:43:01.479 ************************************ 00:43:01.479 11:53:45 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:43:01.479 11:53:45 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:43:01.479 11:53:45 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:01.479 11:53:45 -- common/autotest_common.sh@10 -- # set +x 00:43:01.479 ************************************ 00:43:01.479 START TEST compress_isal 00:43:01.479 ************************************ 00:43:01.479 11:53:45 compress_isal -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:43:01.479 * Looking for test storage... 
00:43:01.480 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:804da62e-425e-e711-906e-0017a4403562 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=804da62e-425e-e711-906e-0017a4403562 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:43:01.480 11:53:45 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:43:01.480 11:53:45 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:43:01.480 11:53:45 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:43:01.480 11:53:45 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:43:01.480 11:53:45 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:43:01.480 11:53:45 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:43:01.480 11:53:45 compress_isal -- paths/export.sh@5 -- # export PATH 00:43:01.480 11:53:45 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@47 -- # : 0 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:43:01.480 11:53:45 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=271984 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:43:01.480 11:53:45 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 271984 00:43:01.480 11:53:45 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:43:01.480 11:53:45 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 271984 ']' 00:43:01.480 11:53:45 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:01.480 11:53:45 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:43:01.480 11:53:45 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:01.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:01.480 11:53:45 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:01.480 11:53:45 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:43:01.480 [2024-06-10 11:53:45.367945] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:43:01.480 [2024-06-10 11:53:45.368000] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid271984 ] 00:43:01.738 [2024-06-10 11:53:45.454466] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:43:01.738 [2024-06-10 11:53:45.538254] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:43:01.738 [2024-06-10 11:53:45.538257] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:43:02.318 11:53:46 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:02.319 11:53:46 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:43:02.319 11:53:46 compress_isal -- compress/compress.sh@74 -- # create_vols 00:43:02.319 11:53:46 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:43:02.319 11:53:46 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:43:02.884 11:53:46 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:43:02.884 11:53:46 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:43:02.884 11:53:46 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:02.884 11:53:46 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:02.884 11:53:46 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:02.884 11:53:46 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:02.884 11:53:46 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:03.143 11:53:46 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 
00:43:03.143 [ 00:43:03.143 { 00:43:03.143 "name": "Nvme0n1", 00:43:03.143 "aliases": [ 00:43:03.143 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:43:03.143 ], 00:43:03.143 "product_name": "NVMe disk", 00:43:03.143 "block_size": 512, 00:43:03.143 "num_blocks": 7501476528, 00:43:03.143 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:43:03.143 "assigned_rate_limits": { 00:43:03.143 "rw_ios_per_sec": 0, 00:43:03.143 "rw_mbytes_per_sec": 0, 00:43:03.143 "r_mbytes_per_sec": 0, 00:43:03.143 "w_mbytes_per_sec": 0 00:43:03.143 }, 00:43:03.143 "claimed": false, 00:43:03.143 "zoned": false, 00:43:03.143 "supported_io_types": { 00:43:03.143 "read": true, 00:43:03.143 "write": true, 00:43:03.143 "unmap": true, 00:43:03.143 "write_zeroes": true, 00:43:03.143 "flush": true, 00:43:03.143 "reset": true, 00:43:03.143 "compare": false, 00:43:03.143 "compare_and_write": false, 00:43:03.143 "abort": true, 00:43:03.143 "nvme_admin": true, 00:43:03.143 "nvme_io": true 00:43:03.143 }, 00:43:03.143 "driver_specific": { 00:43:03.143 "nvme": [ 00:43:03.143 { 00:43:03.143 "pci_address": "0000:5e:00.0", 00:43:03.143 "trid": { 00:43:03.143 "trtype": "PCIe", 00:43:03.143 "traddr": "0000:5e:00.0" 00:43:03.143 }, 00:43:03.143 "ctrlr_data": { 00:43:03.143 "cntlid": 0, 00:43:03.143 "vendor_id": "0x8086", 00:43:03.143 "model_number": "INTEL SSDPF2KX038T1", 00:43:03.143 "serial_number": "PHAX137100A93P8CGN", 00:43:03.143 "firmware_revision": "9CV10015", 00:43:03.143 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:43:03.143 "oacs": { 00:43:03.143 "security": 0, 00:43:03.143 "format": 1, 00:43:03.143 "firmware": 1, 00:43:03.143 "ns_manage": 1 00:43:03.143 }, 00:43:03.143 "multi_ctrlr": false, 00:43:03.143 "ana_reporting": false 00:43:03.143 }, 00:43:03.143 "vs": { 00:43:03.143 "nvme_version": "1.4" 00:43:03.143 }, 00:43:03.143 "ns_data": { 00:43:03.143 "id": 1, 00:43:03.143 "can_share": false 00:43:03.143 } 00:43:03.143 } 00:43:03.143 ], 00:43:03.143 "mp_policy": "active_passive" 00:43:03.143 
} 00:43:03.143 } 00:43:03.143 ] 00:43:03.143 11:53:47 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:03.143 11:53:47 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:43:03.401 4c3d1ea9-2e18-4365-8129-a92c68c01610 00:43:03.401 11:53:47 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:43:03.659 71b4b8a6-25dc-4939-aae1-067acd68c485 00:43:03.659 11:53:47 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:03.659 11:53:47 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:43:03.917 [ 00:43:03.917 { 00:43:03.917 "name": "71b4b8a6-25dc-4939-aae1-067acd68c485", 00:43:03.917 "aliases": [ 00:43:03.917 "lvs0/lv0" 00:43:03.917 ], 00:43:03.917 "product_name": "Logical Volume", 00:43:03.917 "block_size": 512, 00:43:03.917 "num_blocks": 204800, 00:43:03.917 "uuid": "71b4b8a6-25dc-4939-aae1-067acd68c485", 00:43:03.917 "assigned_rate_limits": { 00:43:03.917 "rw_ios_per_sec": 0, 00:43:03.917 "rw_mbytes_per_sec": 0, 00:43:03.917 "r_mbytes_per_sec": 0, 00:43:03.917 "w_mbytes_per_sec": 0 00:43:03.917 }, 00:43:03.917 "claimed": 
false, 00:43:03.917 "zoned": false, 00:43:03.917 "supported_io_types": { 00:43:03.917 "read": true, 00:43:03.917 "write": true, 00:43:03.917 "unmap": true, 00:43:03.917 "write_zeroes": true, 00:43:03.917 "flush": false, 00:43:03.917 "reset": true, 00:43:03.917 "compare": false, 00:43:03.917 "compare_and_write": false, 00:43:03.917 "abort": false, 00:43:03.917 "nvme_admin": false, 00:43:03.917 "nvme_io": false 00:43:03.917 }, 00:43:03.917 "driver_specific": { 00:43:03.917 "lvol": { 00:43:03.917 "lvol_store_uuid": "4c3d1ea9-2e18-4365-8129-a92c68c01610", 00:43:03.917 "base_bdev": "Nvme0n1", 00:43:03.917 "thin_provision": true, 00:43:03.917 "num_allocated_clusters": 0, 00:43:03.917 "snapshot": false, 00:43:03.917 "clone": false, 00:43:03.917 "esnap_clone": false 00:43:03.917 } 00:43:03.917 } 00:43:03.917 } 00:43:03.917 ] 00:43:03.917 11:53:47 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:03.917 11:53:47 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:43:03.917 11:53:47 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:43:04.176 [2024-06-10 11:53:47.909166] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:43:04.176 COMP_lvs0/lv0 00:43:04.176 11:53:47 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:43:04.176 11:53:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:43:04.176 11:53:47 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:04.176 11:53:47 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:04.176 11:53:47 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:04.176 11:53:47 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:04.176 11:53:47 compress_isal -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:04.176 11:53:48 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:43:04.434 [ 00:43:04.434 { 00:43:04.434 "name": "COMP_lvs0/lv0", 00:43:04.434 "aliases": [ 00:43:04.434 "702f2f0d-6a54-5cc2-bc14-c16cb2ff23e9" 00:43:04.434 ], 00:43:04.434 "product_name": "compress", 00:43:04.434 "block_size": 512, 00:43:04.434 "num_blocks": 200704, 00:43:04.434 "uuid": "702f2f0d-6a54-5cc2-bc14-c16cb2ff23e9", 00:43:04.434 "assigned_rate_limits": { 00:43:04.434 "rw_ios_per_sec": 0, 00:43:04.434 "rw_mbytes_per_sec": 0, 00:43:04.434 "r_mbytes_per_sec": 0, 00:43:04.434 "w_mbytes_per_sec": 0 00:43:04.434 }, 00:43:04.434 "claimed": false, 00:43:04.434 "zoned": false, 00:43:04.434 "supported_io_types": { 00:43:04.434 "read": true, 00:43:04.434 "write": true, 00:43:04.434 "unmap": false, 00:43:04.434 "write_zeroes": true, 00:43:04.434 "flush": false, 00:43:04.434 "reset": false, 00:43:04.434 "compare": false, 00:43:04.434 "compare_and_write": false, 00:43:04.434 "abort": false, 00:43:04.434 "nvme_admin": false, 00:43:04.434 "nvme_io": false 00:43:04.434 }, 00:43:04.434 "driver_specific": { 00:43:04.434 "compress": { 00:43:04.434 "name": "COMP_lvs0/lv0", 00:43:04.434 "base_bdev_name": "71b4b8a6-25dc-4939-aae1-067acd68c485" 00:43:04.434 } 00:43:04.434 } 00:43:04.434 } 00:43:04.434 ] 00:43:04.434 11:53:48 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:04.434 11:53:48 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:43:04.434 Running I/O for 3 seconds... 
00:43:07.734 00:43:07.734 Latency(us) 00:43:07.734 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:07.734 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:43:07.734 Verification LBA range: start 0x0 length 0x3100 00:43:07.734 COMP_lvs0/lv0 : 3.00 4025.67 15.73 0.00 0.00 7911.92 466.59 8890.10 00:43:07.734 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:43:07.734 Verification LBA range: start 0x3100 length 0x3100 00:43:07.734 COMP_lvs0/lv0 : 3.00 4022.68 15.71 0.00 0.00 7918.54 690.98 8833.11 00:43:07.734 =================================================================================================================== 00:43:07.734 Total : 8048.35 31.44 0.00 0.00 7915.23 466.59 8890.10 00:43:07.734 0 00:43:07.734 11:53:51 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:43:07.734 11:53:51 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:43:07.734 11:53:51 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:43:07.993 11:53:51 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:43:07.993 11:53:51 compress_isal -- compress/compress.sh@78 -- # killprocess 271984 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 271984 ']' 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@953 -- # kill -0 271984 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@954 -- # uname 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 271984 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:43:07.993 11:53:51 compress_isal -- 
common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 271984' 00:43:07.993 killing process with pid 271984 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@968 -- # kill 271984 00:43:07.993 Received shutdown signal, test time was about 3.000000 seconds 00:43:07.993 00:43:07.993 Latency(us) 00:43:07.993 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:07.993 =================================================================================================================== 00:43:07.993 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:07.993 11:53:51 compress_isal -- common/autotest_common.sh@973 -- # wait 271984 00:43:09.892 11:53:53 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:43:09.892 11:53:53 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:43:09.892 11:53:53 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=272964 00:43:09.892 11:53:53 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:43:09.892 11:53:53 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:43:09.892 11:53:53 compress_isal -- compress/compress.sh@73 -- # waitforlisten 272964 00:43:09.892 11:53:53 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 272964 ']' 00:43:09.892 11:53:53 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:09.892 11:53:53 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:43:09.892 11:53:53 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:43:09.892 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:09.892 11:53:53 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:09.892 11:53:53 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:43:09.892 [2024-06-10 11:53:53.406387] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:43:09.892 [2024-06-10 11:53:53.406449] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid272964 ] 00:43:09.892 [2024-06-10 11:53:53.495031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:43:09.892 [2024-06-10 11:53:53.580693] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:43:09.892 [2024-06-10 11:53:53.580697] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:43:10.456 11:53:54 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:10.456 11:53:54 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:43:10.456 11:53:54 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:43:10.456 11:53:54 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:43:10.456 11:53:54 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:43:11.022 11:53:54 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:43:11.022 11:53:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:43:11.022 11:53:54 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:11.022 11:53:54 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:11.022 11:53:54 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:11.022 11:53:54 
compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:11.022 11:53:54 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:11.022 11:53:54 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:43:11.280 [ 00:43:11.280 { 00:43:11.280 "name": "Nvme0n1", 00:43:11.280 "aliases": [ 00:43:11.280 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:43:11.280 ], 00:43:11.280 "product_name": "NVMe disk", 00:43:11.280 "block_size": 512, 00:43:11.280 "num_blocks": 7501476528, 00:43:11.280 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:43:11.280 "assigned_rate_limits": { 00:43:11.280 "rw_ios_per_sec": 0, 00:43:11.280 "rw_mbytes_per_sec": 0, 00:43:11.280 "r_mbytes_per_sec": 0, 00:43:11.280 "w_mbytes_per_sec": 0 00:43:11.280 }, 00:43:11.280 "claimed": false, 00:43:11.280 "zoned": false, 00:43:11.280 "supported_io_types": { 00:43:11.280 "read": true, 00:43:11.280 "write": true, 00:43:11.280 "unmap": true, 00:43:11.280 "write_zeroes": true, 00:43:11.280 "flush": true, 00:43:11.280 "reset": true, 00:43:11.280 "compare": false, 00:43:11.280 "compare_and_write": false, 00:43:11.280 "abort": true, 00:43:11.280 "nvme_admin": true, 00:43:11.280 "nvme_io": true 00:43:11.280 }, 00:43:11.280 "driver_specific": { 00:43:11.280 "nvme": [ 00:43:11.280 { 00:43:11.280 "pci_address": "0000:5e:00.0", 00:43:11.280 "trid": { 00:43:11.280 "trtype": "PCIe", 00:43:11.280 "traddr": "0000:5e:00.0" 00:43:11.280 }, 00:43:11.280 "ctrlr_data": { 00:43:11.280 "cntlid": 0, 00:43:11.280 "vendor_id": "0x8086", 00:43:11.280 "model_number": "INTEL SSDPF2KX038T1", 00:43:11.280 "serial_number": "PHAX137100A93P8CGN", 00:43:11.280 "firmware_revision": "9CV10015", 00:43:11.280 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:43:11.280 "oacs": { 00:43:11.280 "security": 0, 00:43:11.280 "format": 1, 00:43:11.280 
"firmware": 1, 00:43:11.280 "ns_manage": 1 00:43:11.280 }, 00:43:11.280 "multi_ctrlr": false, 00:43:11.280 "ana_reporting": false 00:43:11.280 }, 00:43:11.280 "vs": { 00:43:11.280 "nvme_version": "1.4" 00:43:11.280 }, 00:43:11.280 "ns_data": { 00:43:11.280 "id": 1, 00:43:11.280 "can_share": false 00:43:11.281 } 00:43:11.281 } 00:43:11.281 ], 00:43:11.281 "mp_policy": "active_passive" 00:43:11.281 } 00:43:11.281 } 00:43:11.281 ] 00:43:11.281 11:53:55 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:11.281 11:53:55 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:43:11.539 e42f35ef-3041-4452-8028-267923b2861f 00:43:11.539 11:53:55 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:43:11.539 5bc69fdf-974e-43f3-be20-487d4e2b9ce0 00:43:11.539 11:53:55 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:43:11.539 11:53:55 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:43:11.539 11:53:55 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:11.539 11:53:55 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:11.539 11:53:55 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:11.539 11:53:55 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:11.539 11:53:55 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:11.797 11:53:55 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:43:12.056 [ 00:43:12.056 { 00:43:12.056 "name": "5bc69fdf-974e-43f3-be20-487d4e2b9ce0", 00:43:12.056 "aliases": [ 00:43:12.056 "lvs0/lv0" 00:43:12.056 
], 00:43:12.056 "product_name": "Logical Volume", 00:43:12.056 "block_size": 512, 00:43:12.056 "num_blocks": 204800, 00:43:12.056 "uuid": "5bc69fdf-974e-43f3-be20-487d4e2b9ce0", 00:43:12.056 "assigned_rate_limits": { 00:43:12.056 "rw_ios_per_sec": 0, 00:43:12.056 "rw_mbytes_per_sec": 0, 00:43:12.056 "r_mbytes_per_sec": 0, 00:43:12.056 "w_mbytes_per_sec": 0 00:43:12.056 }, 00:43:12.056 "claimed": false, 00:43:12.056 "zoned": false, 00:43:12.056 "supported_io_types": { 00:43:12.056 "read": true, 00:43:12.056 "write": true, 00:43:12.056 "unmap": true, 00:43:12.056 "write_zeroes": true, 00:43:12.056 "flush": false, 00:43:12.056 "reset": true, 00:43:12.056 "compare": false, 00:43:12.056 "compare_and_write": false, 00:43:12.056 "abort": false, 00:43:12.056 "nvme_admin": false, 00:43:12.056 "nvme_io": false 00:43:12.056 }, 00:43:12.056 "driver_specific": { 00:43:12.056 "lvol": { 00:43:12.056 "lvol_store_uuid": "e42f35ef-3041-4452-8028-267923b2861f", 00:43:12.056 "base_bdev": "Nvme0n1", 00:43:12.056 "thin_provision": true, 00:43:12.056 "num_allocated_clusters": 0, 00:43:12.056 "snapshot": false, 00:43:12.056 "clone": false, 00:43:12.056 "esnap_clone": false 00:43:12.056 } 00:43:12.056 } 00:43:12.056 } 00:43:12.056 ] 00:43:12.056 11:53:55 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:12.056 11:53:55 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:43:12.056 11:53:55 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:43:12.056 [2024-06-10 11:53:55.937100] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:43:12.056 COMP_lvs0/lv0 00:43:12.056 11:53:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:43:12.056 11:53:55 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:43:12.056 11:53:55 compress_isal -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:12.056 11:53:55 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:12.056 11:53:55 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:12.056 11:53:55 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:12.056 11:53:55 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:12.326 11:53:56 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:43:12.584 [ 00:43:12.584 { 00:43:12.584 "name": "COMP_lvs0/lv0", 00:43:12.584 "aliases": [ 00:43:12.584 "b250530f-2b69-5492-9a74-b5bcafb640f9" 00:43:12.584 ], 00:43:12.584 "product_name": "compress", 00:43:12.584 "block_size": 512, 00:43:12.584 "num_blocks": 200704, 00:43:12.584 "uuid": "b250530f-2b69-5492-9a74-b5bcafb640f9", 00:43:12.584 "assigned_rate_limits": { 00:43:12.584 "rw_ios_per_sec": 0, 00:43:12.584 "rw_mbytes_per_sec": 0, 00:43:12.584 "r_mbytes_per_sec": 0, 00:43:12.584 "w_mbytes_per_sec": 0 00:43:12.584 }, 00:43:12.584 "claimed": false, 00:43:12.584 "zoned": false, 00:43:12.584 "supported_io_types": { 00:43:12.584 "read": true, 00:43:12.584 "write": true, 00:43:12.584 "unmap": false, 00:43:12.584 "write_zeroes": true, 00:43:12.585 "flush": false, 00:43:12.585 "reset": false, 00:43:12.585 "compare": false, 00:43:12.585 "compare_and_write": false, 00:43:12.585 "abort": false, 00:43:12.585 "nvme_admin": false, 00:43:12.585 "nvme_io": false 00:43:12.585 }, 00:43:12.585 "driver_specific": { 00:43:12.585 "compress": { 00:43:12.585 "name": "COMP_lvs0/lv0", 00:43:12.585 "base_bdev_name": "5bc69fdf-974e-43f3-be20-487d4e2b9ce0" 00:43:12.585 } 00:43:12.585 } 00:43:12.585 } 00:43:12.585 ] 00:43:12.585 11:53:56 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:12.585 11:53:56 compress_isal -- 
compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:43:12.585 Running I/O for 3 seconds... 00:43:15.868 00:43:15.868 Latency(us) 00:43:15.868 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:15.868 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:43:15.868 Verification LBA range: start 0x0 length 0x3100 00:43:15.868 COMP_lvs0/lv0 : 3.00 4085.81 15.96 0.00 0.00 7795.19 740.84 6753.06 00:43:15.868 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:43:15.868 Verification LBA range: start 0x3100 length 0x3100 00:43:15.868 COMP_lvs0/lv0 : 3.00 4090.78 15.98 0.00 0.00 7787.94 477.27 6810.05 00:43:15.868 =================================================================================================================== 00:43:15.868 Total : 8176.59 31.94 0.00 0.00 7791.56 477.27 6810.05 00:43:15.868 0 00:43:15.868 11:53:59 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:43:15.868 11:53:59 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:43:15.868 11:53:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:43:16.126 11:53:59 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:43:16.126 11:53:59 compress_isal -- compress/compress.sh@78 -- # killprocess 272964 00:43:16.126 11:53:59 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 272964 ']' 00:43:16.126 11:53:59 compress_isal -- common/autotest_common.sh@953 -- # kill -0 272964 00:43:16.126 11:53:59 compress_isal -- common/autotest_common.sh@954 -- # uname 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@955 -- # ps 
--no-headers -o comm= 272964 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 272964' 00:43:16.127 killing process with pid 272964 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@968 -- # kill 272964 00:43:16.127 Received shutdown signal, test time was about 3.000000 seconds 00:43:16.127 00:43:16.127 Latency(us) 00:43:16.127 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:16.127 =================================================================================================================== 00:43:16.127 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:16.127 11:53:59 compress_isal -- common/autotest_common.sh@973 -- # wait 272964 00:43:18.049 11:54:01 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:43:18.049 11:54:01 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:43:18.049 11:54:01 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=274128 00:43:18.049 11:54:01 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:43:18.049 11:54:01 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:43:18.049 11:54:01 compress_isal -- compress/compress.sh@73 -- # waitforlisten 274128 00:43:18.049 11:54:01 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 274128 ']' 00:43:18.049 11:54:01 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:18.049 11:54:01 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:43:18.049 11:54:01 compress_isal -- common/autotest_common.sh@837 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:18.049 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:18.049 11:54:01 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:18.049 11:54:01 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:43:18.049 [2024-06-10 11:54:01.521551] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:43:18.049 [2024-06-10 11:54:01.521613] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid274128 ] 00:43:18.049 [2024-06-10 11:54:01.613027] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:43:18.049 [2024-06-10 11:54:01.692017] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:43:18.049 [2024-06-10 11:54:01.692020] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:43:18.668 11:54:02 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:18.668 11:54:02 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:43:18.668 11:54:02 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:43:18.668 11:54:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:43:18.668 11:54:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:43:18.927 11:54:02 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:43:18.927 11:54:02 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:43:18.927 11:54:02 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:18.927 11:54:02 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:18.927 
11:54:02 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:18.927 11:54:02 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:18.927 11:54:02 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:19.186 11:54:03 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:43:19.444 [ 00:43:19.444 { 00:43:19.444 "name": "Nvme0n1", 00:43:19.444 "aliases": [ 00:43:19.444 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:43:19.444 ], 00:43:19.444 "product_name": "NVMe disk", 00:43:19.444 "block_size": 512, 00:43:19.444 "num_blocks": 7501476528, 00:43:19.444 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:43:19.444 "assigned_rate_limits": { 00:43:19.444 "rw_ios_per_sec": 0, 00:43:19.444 "rw_mbytes_per_sec": 0, 00:43:19.444 "r_mbytes_per_sec": 0, 00:43:19.444 "w_mbytes_per_sec": 0 00:43:19.444 }, 00:43:19.444 "claimed": false, 00:43:19.444 "zoned": false, 00:43:19.444 "supported_io_types": { 00:43:19.444 "read": true, 00:43:19.444 "write": true, 00:43:19.444 "unmap": true, 00:43:19.444 "write_zeroes": true, 00:43:19.444 "flush": true, 00:43:19.444 "reset": true, 00:43:19.444 "compare": false, 00:43:19.444 "compare_and_write": false, 00:43:19.445 "abort": true, 00:43:19.445 "nvme_admin": true, 00:43:19.445 "nvme_io": true 00:43:19.445 }, 00:43:19.445 "driver_specific": { 00:43:19.445 "nvme": [ 00:43:19.445 { 00:43:19.445 "pci_address": "0000:5e:00.0", 00:43:19.445 "trid": { 00:43:19.445 "trtype": "PCIe", 00:43:19.445 "traddr": "0000:5e:00.0" 00:43:19.445 }, 00:43:19.445 "ctrlr_data": { 00:43:19.445 "cntlid": 0, 00:43:19.445 "vendor_id": "0x8086", 00:43:19.445 "model_number": "INTEL SSDPF2KX038T1", 00:43:19.445 "serial_number": "PHAX137100A93P8CGN", 00:43:19.445 "firmware_revision": "9CV10015", 00:43:19.445 "subnqn": 
"nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:43:19.445 "oacs": { 00:43:19.445 "security": 0, 00:43:19.445 "format": 1, 00:43:19.445 "firmware": 1, 00:43:19.445 "ns_manage": 1 00:43:19.445 }, 00:43:19.445 "multi_ctrlr": false, 00:43:19.445 "ana_reporting": false 00:43:19.445 }, 00:43:19.445 "vs": { 00:43:19.445 "nvme_version": "1.4" 00:43:19.445 }, 00:43:19.445 "ns_data": { 00:43:19.445 "id": 1, 00:43:19.445 "can_share": false 00:43:19.445 } 00:43:19.445 } 00:43:19.445 ], 00:43:19.445 "mp_policy": "active_passive" 00:43:19.445 } 00:43:19.445 } 00:43:19.445 ] 00:43:19.445 11:54:03 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:19.445 11:54:03 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:43:19.445 ecad4cec-b900-4bac-84b3-7cac8bd6ed33 00:43:19.445 11:54:03 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:43:19.704 82afd0bf-77db-45ef-9284-a8138c835d68 00:43:19.704 11:54:03 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:43:19.704 11:54:03 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:43:19.704 11:54:03 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:19.704 11:54:03 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:19.704 11:54:03 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:19.704 11:54:03 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:19.704 11:54:03 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:19.962 11:54:03 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:43:19.962 [ 
00:43:19.962 { 00:43:19.962 "name": "82afd0bf-77db-45ef-9284-a8138c835d68", 00:43:19.962 "aliases": [ 00:43:19.962 "lvs0/lv0" 00:43:19.962 ], 00:43:19.963 "product_name": "Logical Volume", 00:43:19.963 "block_size": 512, 00:43:19.963 "num_blocks": 204800, 00:43:19.963 "uuid": "82afd0bf-77db-45ef-9284-a8138c835d68", 00:43:19.963 "assigned_rate_limits": { 00:43:19.963 "rw_ios_per_sec": 0, 00:43:19.963 "rw_mbytes_per_sec": 0, 00:43:19.963 "r_mbytes_per_sec": 0, 00:43:19.963 "w_mbytes_per_sec": 0 00:43:19.963 }, 00:43:19.963 "claimed": false, 00:43:19.963 "zoned": false, 00:43:19.963 "supported_io_types": { 00:43:19.963 "read": true, 00:43:19.963 "write": true, 00:43:19.963 "unmap": true, 00:43:19.963 "write_zeroes": true, 00:43:19.963 "flush": false, 00:43:19.963 "reset": true, 00:43:19.963 "compare": false, 00:43:19.963 "compare_and_write": false, 00:43:19.963 "abort": false, 00:43:19.963 "nvme_admin": false, 00:43:19.963 "nvme_io": false 00:43:19.963 }, 00:43:19.963 "driver_specific": { 00:43:19.963 "lvol": { 00:43:19.963 "lvol_store_uuid": "ecad4cec-b900-4bac-84b3-7cac8bd6ed33", 00:43:19.963 "base_bdev": "Nvme0n1", 00:43:19.963 "thin_provision": true, 00:43:19.963 "num_allocated_clusters": 0, 00:43:19.963 "snapshot": false, 00:43:19.963 "clone": false, 00:43:19.963 "esnap_clone": false 00:43:19.963 } 00:43:19.963 } 00:43:19.963 } 00:43:19.963 ] 00:43:19.963 11:54:03 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:19.963 11:54:03 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:43:19.963 11:54:03 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:43:20.222 [2024-06-10 11:54:04.032740] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:43:20.222 COMP_lvs0/lv0 00:43:20.222 11:54:04 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 
00:43:20.222 11:54:04 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:43:20.222 11:54:04 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:20.222 11:54:04 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:20.222 11:54:04 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:20.222 11:54:04 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:20.222 11:54:04 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:20.481 11:54:04 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:43:20.481 [ 00:43:20.481 { 00:43:20.481 "name": "COMP_lvs0/lv0", 00:43:20.481 "aliases": [ 00:43:20.481 "92be16d9-4bdf-534b-9174-1887b0acacf1" 00:43:20.481 ], 00:43:20.481 "product_name": "compress", 00:43:20.481 "block_size": 4096, 00:43:20.481 "num_blocks": 25088, 00:43:20.481 "uuid": "92be16d9-4bdf-534b-9174-1887b0acacf1", 00:43:20.481 "assigned_rate_limits": { 00:43:20.481 "rw_ios_per_sec": 0, 00:43:20.481 "rw_mbytes_per_sec": 0, 00:43:20.481 "r_mbytes_per_sec": 0, 00:43:20.481 "w_mbytes_per_sec": 0 00:43:20.481 }, 00:43:20.481 "claimed": false, 00:43:20.481 "zoned": false, 00:43:20.481 "supported_io_types": { 00:43:20.481 "read": true, 00:43:20.481 "write": true, 00:43:20.481 "unmap": false, 00:43:20.481 "write_zeroes": true, 00:43:20.481 "flush": false, 00:43:20.481 "reset": false, 00:43:20.481 "compare": false, 00:43:20.481 "compare_and_write": false, 00:43:20.481 "abort": false, 00:43:20.481 "nvme_admin": false, 00:43:20.481 "nvme_io": false 00:43:20.481 }, 00:43:20.481 "driver_specific": { 00:43:20.481 "compress": { 00:43:20.481 "name": "COMP_lvs0/lv0", 00:43:20.481 "base_bdev_name": "82afd0bf-77db-45ef-9284-a8138c835d68" 00:43:20.481 } 00:43:20.481 } 00:43:20.481 } 
00:43:20.481 ] 00:43:20.741 11:54:04 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:20.741 11:54:04 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:43:20.741 Running I/O for 3 seconds... 00:43:24.032 00:43:24.032 Latency(us) 00:43:24.032 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:24.032 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:43:24.032 Verification LBA range: start 0x0 length 0x3100 00:43:24.032 COMP_lvs0/lv0 : 3.00 4072.93 15.91 0.00 0.00 7819.28 769.34 6952.51 00:43:24.032 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:43:24.032 Verification LBA range: start 0x3100 length 0x3100 00:43:24.032 COMP_lvs0/lv0 : 3.00 4079.16 15.93 0.00 0.00 7810.39 498.64 6895.53 00:43:24.032 =================================================================================================================== 00:43:24.032 Total : 8152.09 31.84 0.00 0.00 7814.83 498.64 6952.51 00:43:24.032 0 00:43:24.032 11:54:07 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:43:24.032 11:54:07 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:43:24.032 11:54:07 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:43:24.032 11:54:07 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:43:24.032 11:54:07 compress_isal -- compress/compress.sh@78 -- # killprocess 274128 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 274128 ']' 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@953 -- # kill -0 274128 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@954 -- # uname 00:43:24.032 11:54:07 compress_isal 
-- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 274128 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 274128' 00:43:24.032 killing process with pid 274128 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@968 -- # kill 274128 00:43:24.032 Received shutdown signal, test time was about 3.000000 seconds 00:43:24.032 00:43:24.032 Latency(us) 00:43:24.032 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:24.032 =================================================================================================================== 00:43:24.032 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:24.032 11:54:07 compress_isal -- common/autotest_common.sh@973 -- # wait 274128 00:43:26.005 11:54:09 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:43:26.005 11:54:09 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:43:26.005 11:54:09 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=275610 00:43:26.005 11:54:09 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:43:26.005 11:54:09 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:43:26.005 11:54:09 compress_isal -- compress/compress.sh@57 -- # waitforlisten 275610 00:43:26.005 11:54:09 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 275610 ']' 00:43:26.005 11:54:09 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:26.005 11:54:09 compress_isal -- common/autotest_common.sh@835 -- # local 
max_retries=100 00:43:26.005 11:54:09 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:26.005 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:26.005 11:54:09 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:26.005 11:54:09 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:43:26.005 [2024-06-10 11:54:09.623859] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:43:26.005 [2024-06-10 11:54:09.623921] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid275610 ] 00:43:26.005 [2024-06-10 11:54:09.711748] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:43:26.005 [2024-06-10 11:54:09.797556] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:43:26.005 [2024-06-10 11:54:09.797645] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:43:26.005 [2024-06-10 11:54:09.797648] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:43:26.573 11:54:10 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:26.573 11:54:10 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:43:26.573 11:54:10 compress_isal -- compress/compress.sh@58 -- # create_vols 00:43:26.573 11:54:10 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:43:26.573 11:54:10 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:43:27.141 11:54:10 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:43:27.141 11:54:10 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 
00:43:27.141 11:54:10 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:27.141 11:54:10 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:27.141 11:54:10 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:27.141 11:54:10 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:27.141 11:54:10 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:27.400 11:54:11 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:43:27.400 [ 00:43:27.400 { 00:43:27.400 "name": "Nvme0n1", 00:43:27.400 "aliases": [ 00:43:27.400 "01000000-0000-0000-5cd2-e4cdeb7b5351" 00:43:27.400 ], 00:43:27.400 "product_name": "NVMe disk", 00:43:27.400 "block_size": 512, 00:43:27.400 "num_blocks": 7501476528, 00:43:27.400 "uuid": "01000000-0000-0000-5cd2-e4cdeb7b5351", 00:43:27.400 "assigned_rate_limits": { 00:43:27.400 "rw_ios_per_sec": 0, 00:43:27.400 "rw_mbytes_per_sec": 0, 00:43:27.400 "r_mbytes_per_sec": 0, 00:43:27.400 "w_mbytes_per_sec": 0 00:43:27.400 }, 00:43:27.400 "claimed": false, 00:43:27.400 "zoned": false, 00:43:27.400 "supported_io_types": { 00:43:27.400 "read": true, 00:43:27.400 "write": true, 00:43:27.400 "unmap": true, 00:43:27.400 "write_zeroes": true, 00:43:27.400 "flush": true, 00:43:27.400 "reset": true, 00:43:27.400 "compare": false, 00:43:27.400 "compare_and_write": false, 00:43:27.400 "abort": true, 00:43:27.400 "nvme_admin": true, 00:43:27.400 "nvme_io": true 00:43:27.400 }, 00:43:27.400 "driver_specific": { 00:43:27.400 "nvme": [ 00:43:27.400 { 00:43:27.400 "pci_address": "0000:5e:00.0", 00:43:27.400 "trid": { 00:43:27.400 "trtype": "PCIe", 00:43:27.400 "traddr": "0000:5e:00.0" 00:43:27.400 }, 00:43:27.400 "ctrlr_data": { 00:43:27.400 "cntlid": 0, 00:43:27.400 "vendor_id": "0x8086", 00:43:27.400 
"model_number": "INTEL SSDPF2KX038T1", 00:43:27.400 "serial_number": "PHAX137100A93P8CGN", 00:43:27.400 "firmware_revision": "9CV10015", 00:43:27.400 "subnqn": "nqn.2021-09.com.intel:PHAX137100A93P8CGN ", 00:43:27.400 "oacs": { 00:43:27.400 "security": 0, 00:43:27.400 "format": 1, 00:43:27.400 "firmware": 1, 00:43:27.400 "ns_manage": 1 00:43:27.400 }, 00:43:27.400 "multi_ctrlr": false, 00:43:27.400 "ana_reporting": false 00:43:27.400 }, 00:43:27.400 "vs": { 00:43:27.400 "nvme_version": "1.4" 00:43:27.400 }, 00:43:27.400 "ns_data": { 00:43:27.400 "id": 1, 00:43:27.400 "can_share": false 00:43:27.400 } 00:43:27.400 } 00:43:27.400 ], 00:43:27.400 "mp_policy": "active_passive" 00:43:27.400 } 00:43:27.400 } 00:43:27.400 ] 00:43:27.400 11:54:11 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:27.400 11:54:11 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:43:27.660 129e23b5-83ec-4c17-af9b-1e92224914a1 00:43:27.660 11:54:11 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:43:27.918 45fa99b4-d0be-455f-8fa9-f3012ede7a7c 00:43:27.918 11:54:11 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:43:27.918 11:54:11 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:43:27.918 11:54:11 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:27.918 11:54:11 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:27.918 11:54:11 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:27.918 11:54:11 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:27.918 11:54:11 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:27.918 11:54:11 
compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:43:28.178 [ 00:43:28.178 { 00:43:28.178 "name": "45fa99b4-d0be-455f-8fa9-f3012ede7a7c", 00:43:28.178 "aliases": [ 00:43:28.178 "lvs0/lv0" 00:43:28.178 ], 00:43:28.178 "product_name": "Logical Volume", 00:43:28.178 "block_size": 512, 00:43:28.178 "num_blocks": 204800, 00:43:28.178 "uuid": "45fa99b4-d0be-455f-8fa9-f3012ede7a7c", 00:43:28.178 "assigned_rate_limits": { 00:43:28.178 "rw_ios_per_sec": 0, 00:43:28.178 "rw_mbytes_per_sec": 0, 00:43:28.178 "r_mbytes_per_sec": 0, 00:43:28.178 "w_mbytes_per_sec": 0 00:43:28.178 }, 00:43:28.178 "claimed": false, 00:43:28.178 "zoned": false, 00:43:28.178 "supported_io_types": { 00:43:28.178 "read": true, 00:43:28.178 "write": true, 00:43:28.178 "unmap": true, 00:43:28.178 "write_zeroes": true, 00:43:28.178 "flush": false, 00:43:28.178 "reset": true, 00:43:28.178 "compare": false, 00:43:28.178 "compare_and_write": false, 00:43:28.178 "abort": false, 00:43:28.178 "nvme_admin": false, 00:43:28.178 "nvme_io": false 00:43:28.178 }, 00:43:28.178 "driver_specific": { 00:43:28.178 "lvol": { 00:43:28.178 "lvol_store_uuid": "129e23b5-83ec-4c17-af9b-1e92224914a1", 00:43:28.178 "base_bdev": "Nvme0n1", 00:43:28.178 "thin_provision": true, 00:43:28.178 "num_allocated_clusters": 0, 00:43:28.178 "snapshot": false, 00:43:28.178 "clone": false, 00:43:28.178 "esnap_clone": false 00:43:28.178 } 00:43:28.178 } 00:43:28.178 } 00:43:28.178 ] 00:43:28.178 11:54:12 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:28.178 11:54:12 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:43:28.178 11:54:12 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:43:28.436 [2024-06-10 11:54:12.167480] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and 
virtual bdev for: COMP_lvs0/lv0 00:43:28.436 COMP_lvs0/lv0 00:43:28.436 11:54:12 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@900 -- # local i 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:43:28.436 11:54:12 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:43:28.694 [ 00:43:28.694 { 00:43:28.694 "name": "COMP_lvs0/lv0", 00:43:28.694 "aliases": [ 00:43:28.694 "6fcf276b-a884-52ec-8c9e-36e2d55194f1" 00:43:28.694 ], 00:43:28.694 "product_name": "compress", 00:43:28.694 "block_size": 512, 00:43:28.694 "num_blocks": 200704, 00:43:28.694 "uuid": "6fcf276b-a884-52ec-8c9e-36e2d55194f1", 00:43:28.694 "assigned_rate_limits": { 00:43:28.694 "rw_ios_per_sec": 0, 00:43:28.694 "rw_mbytes_per_sec": 0, 00:43:28.694 "r_mbytes_per_sec": 0, 00:43:28.694 "w_mbytes_per_sec": 0 00:43:28.694 }, 00:43:28.694 "claimed": false, 00:43:28.694 "zoned": false, 00:43:28.694 "supported_io_types": { 00:43:28.694 "read": true, 00:43:28.694 "write": true, 00:43:28.694 "unmap": false, 00:43:28.694 "write_zeroes": true, 00:43:28.694 "flush": false, 00:43:28.694 "reset": false, 00:43:28.694 "compare": false, 00:43:28.695 "compare_and_write": false, 00:43:28.695 "abort": false, 00:43:28.695 "nvme_admin": false, 00:43:28.695 "nvme_io": false 00:43:28.695 }, 00:43:28.695 "driver_specific": { 00:43:28.695 "compress": { 
00:43:28.695 "name": "COMP_lvs0/lv0", 00:43:28.695 "base_bdev_name": "45fa99b4-d0be-455f-8fa9-f3012ede7a7c" 00:43:28.695 } 00:43:28.695 } 00:43:28.695 } 00:43:28.695 ] 00:43:28.695 11:54:12 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:43:28.695 11:54:12 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:43:28.695 I/O targets: 00:43:28.695 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:43:28.695 00:43:28.695 00:43:28.695 CUnit - A unit testing framework for C - Version 2.1-3 00:43:28.695 http://cunit.sourceforge.net/ 00:43:28.695 00:43:28.695 00:43:28.695 Suite: bdevio tests on: COMP_lvs0/lv0 00:43:28.695 Test: blockdev write read block ...passed 00:43:28.695 Test: blockdev write zeroes read block ...passed 00:43:28.954 Test: blockdev write zeroes read no split ...passed 00:43:28.954 Test: blockdev write zeroes read split ...passed 00:43:28.954 Test: blockdev write zeroes read split partial ...passed 00:43:28.954 Test: blockdev reset ...[2024-06-10 11:54:12.683344] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:43:28.954 passed 00:43:28.954 Test: blockdev write read 8 blocks ...passed 00:43:28.954 Test: blockdev write read size > 128k ...passed 00:43:28.954 Test: blockdev write read invalid size ...passed 00:43:28.954 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:43:28.954 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:43:28.954 Test: blockdev write read max offset ...passed 00:43:28.954 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:43:28.954 Test: blockdev writev readv 8 blocks ...passed 00:43:28.954 Test: blockdev writev readv 30 x 1block ...passed 00:43:28.954 Test: blockdev writev readv block ...passed 00:43:28.954 Test: blockdev writev readv size > 128k ...passed 00:43:28.954 Test: blockdev writev readv size > 128k in two iovs 
...passed 00:43:28.954 Test: blockdev comparev and writev ...passed 00:43:28.954 Test: blockdev nvme passthru rw ...passed 00:43:28.954 Test: blockdev nvme passthru vendor specific ...passed 00:43:28.954 Test: blockdev nvme admin passthru ...passed 00:43:28.954 Test: blockdev copy ...passed 00:43:28.954 00:43:28.954 Run Summary: Type Total Ran Passed Failed Inactive 00:43:28.954 suites 1 1 n/a 0 0 00:43:28.954 tests 23 23 23 0 0 00:43:28.954 asserts 130 130 130 0 n/a 00:43:28.954 00:43:28.954 Elapsed time = 0.109 seconds 00:43:28.954 0 00:43:28.954 11:54:12 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:43:28.954 11:54:12 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:43:28.954 11:54:12 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:43:29.214 11:54:13 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:43:29.214 11:54:13 compress_isal -- compress/compress.sh@62 -- # killprocess 275610 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 275610 ']' 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@953 -- # kill -0 275610 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@954 -- # uname 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 275610 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 275610' 00:43:29.214 killing process with pid 275610 00:43:29.214 11:54:13 compress_isal -- 
common/autotest_common.sh@968 -- # kill 275610 00:43:29.214 11:54:13 compress_isal -- common/autotest_common.sh@973 -- # wait 275610 00:43:31.125 11:54:14 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:43:31.125 11:54:14 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:43:31.125 00:43:31.125 real 0m29.554s 00:43:31.125 user 1m8.428s 00:43:31.125 sys 0m3.437s 00:43:31.125 11:54:14 compress_isal -- common/autotest_common.sh@1125 -- # xtrace_disable 00:43:31.125 11:54:14 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:43:31.125 ************************************ 00:43:31.125 END TEST compress_isal 00:43:31.125 ************************************ 00:43:31.125 11:54:14 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:43:31.125 11:54:14 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:43:31.125 11:54:14 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:43:31.125 11:54:14 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:43:31.125 11:54:14 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:31.125 11:54:14 -- common/autotest_common.sh@10 -- # set +x 00:43:31.125 ************************************ 00:43:31.125 START TEST blockdev_crypto_aesni 00:43:31.125 ************************************ 00:43:31.125 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:43:31.125 * Looking for test storage... 
00:43:31.125 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:43:31.125 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:43:31.126 11:54:14 
blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=276362 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:43:31.126 11:54:14 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 276362 00:43:31.126 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@830 -- # '[' -z 276362 ']' 00:43:31.126 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:31.126 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local max_retries=100 00:43:31.126 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:31.126 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:43:31.126 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:31.126 11:54:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:31.126 [2024-06-10 11:54:14.974199] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:43:31.126 [2024-06-10 11:54:14.974258] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid276362 ] 00:43:31.126 [2024-06-10 11:54:15.061751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:31.389 [2024-06-10 11:54:15.148280] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:43:31.955 11:54:15 blockdev_crypto_aesni -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:31.955 11:54:15 blockdev_crypto_aesni -- common/autotest_common.sh@863 -- # return 0 00:43:31.955 11:54:15 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:43:31.955 11:54:15 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:43:31.955 11:54:15 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:43:31.955 11:54:15 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:43:31.955 11:54:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:31.955 [2024-06-10 11:54:15.782199] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:43:31.955 [2024-06-10 11:54:15.790231] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:43:31.955 [2024-06-10 11:54:15.798245] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:43:31.955 [2024-06-10 11:54:15.857519] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto 
devices: 97 00:43:34.489 true 00:43:34.489 true 00:43:34.489 true 00:43:34.489 true 00:43:34.489 Malloc0 00:43:34.489 Malloc1 00:43:34.489 Malloc2 00:43:34.489 Malloc3 00:43:34.489 [2024-06-10 11:54:18.201851] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:43:34.489 crypto_ram 00:43:34.489 [2024-06-10 11:54:18.209879] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:43:34.489 crypto_ram2 00:43:34.489 [2024-06-10 11:54:18.217895] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:43:34.489 crypto_ram3 00:43:34.489 [2024-06-10 11:54:18.225913] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:43:34.489 crypto_ram4 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:43:34.489 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:43:34.489 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:43:34.489 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:43:34.489 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:43:34.489 11:54:18 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:43:34.489 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "fe30364a-b10b-50e8-8e8b-0a2d31ea9b3e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fe30364a-b10b-50e8-8e8b-0a2d31ea9b3e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2bc1a8c1-a478-574c-abbc-147afca836f4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2bc1a8c1-a478-574c-abbc-147afca836f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da8eb50c-6ea2-5544-a953-5921563ec487"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da8eb50c-6ea2-5544-a953-5921563ec487",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "0f9d5200-8433-5694-a444-28dbe4407d6e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "0f9d5200-8433-5694-a444-28dbe4407d6e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:43:34.490 11:54:18 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:43:34.490 11:54:18 blockdev_crypto_aesni -- 
bdev/blockdev.sh@754 -- # killprocess 276362 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@949 -- # '[' -z 276362 ']' 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # kill -0 276362 00:43:34.490 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # uname 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 276362 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # echo 'killing process with pid 276362' 00:43:34.750 killing process with pid 276362 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # kill 276362 00:43:34.750 11:54:18 blockdev_crypto_aesni -- common/autotest_common.sh@973 -- # wait 276362 00:43:35.318 11:54:19 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:43:35.318 11:54:19 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:43:35.318 11:54:19 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:43:35.318 11:54:19 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:35.318 11:54:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:35.318 ************************************ 00:43:35.318 START TEST bdev_hello_world 00:43:35.318 ************************************ 00:43:35.318 11:54:19 blockdev_crypto_aesni.bdev_hello_world -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:43:35.318 [2024-06-10 11:54:19.096434] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:43:35.318 [2024-06-10 11:54:19.096483] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid276924 ] 00:43:35.318 [2024-06-10 11:54:19.184229] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:35.578 [2024-06-10 11:54:19.267494] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:43:35.578 [2024-06-10 11:54:19.288410] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:43:35.578 [2024-06-10 11:54:19.296440] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:43:35.578 [2024-06-10 11:54:19.304461] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:43:35.578 [2024-06-10 11:54:19.399974] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:43:38.132 [2024-06-10 11:54:21.578535] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:43:38.132 [2024-06-10 11:54:21.578594] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:43:38.132 [2024-06-10 11:54:21.578605] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:38.132 [2024-06-10 11:54:21.586551] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:43:38.132 [2024-06-10 11:54:21.586565] bdev.c:8114:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:43:38.132 [2024-06-10 11:54:21.586572] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:38.132 [2024-06-10 11:54:21.594572] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:43:38.132 [2024-06-10 11:54:21.594584] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:43:38.132 [2024-06-10 11:54:21.594592] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:38.132 [2024-06-10 11:54:21.602591] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:43:38.132 [2024-06-10 11:54:21.602603] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:43:38.132 [2024-06-10 11:54:21.602610] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:38.132 [2024-06-10 11:54:21.675286] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:43:38.132 [2024-06-10 11:54:21.675327] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:43:38.132 [2024-06-10 11:54:21.675340] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:43:38.132 [2024-06-10 11:54:21.676194] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:43:38.132 [2024-06-10 11:54:21.676257] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:43:38.132 [2024-06-10 11:54:21.676269] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:43:38.132 [2024-06-10 11:54:21.676302] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:43:38.132 00:43:38.132 [2024-06-10 11:54:21.676315] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:43:38.132 00:43:38.132 real 0m3.013s 00:43:38.132 user 0m2.665s 00:43:38.132 sys 0m0.317s 00:43:38.132 11:54:22 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:43:38.132 11:54:22 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:43:38.132 ************************************ 00:43:38.132 END TEST bdev_hello_world 00:43:38.132 ************************************ 00:43:38.391 11:54:22 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:43:38.391 11:54:22 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:43:38.391 11:54:22 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:38.391 11:54:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:38.391 ************************************ 00:43:38.391 START TEST bdev_bounds 00:43:38.391 ************************************ 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=277441 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 277441' 00:43:38.391 Process bdevio pid: 277441 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 277441 00:43:38.391 11:54:22 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 277441 ']' 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:38.391 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:38.391 11:54:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:43:38.391 [2024-06-10 11:54:22.176970] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:43:38.391 [2024-06-10 11:54:22.177019] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid277441 ] 00:43:38.391 [2024-06-10 11:54:22.259771] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:43:38.650 [2024-06-10 11:54:22.350917] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:43:38.650 [2024-06-10 11:54:22.351002] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:43:38.650 [2024-06-10 11:54:22.351005] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:43:38.650 [2024-06-10 11:54:22.371945] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:43:38.650 [2024-06-10 11:54:22.379973] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:43:38.650 [2024-06-10 11:54:22.387988] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:43:38.650 [2024-06-10 11:54:22.489306] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:43:41.186 [2024-06-10 11:54:24.680723] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:43:41.186 [2024-06-10 11:54:24.680778] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:43:41.186 [2024-06-10 11:54:24.680788] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:41.186 [2024-06-10 11:54:24.688742] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:43:41.186 [2024-06-10 11:54:24.688757] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:43:41.186 [2024-06-10 11:54:24.688765] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:41.186 [2024-06-10 11:54:24.696762] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:43:41.186 [2024-06-10 11:54:24.696776] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:43:41.186 [2024-06-10 11:54:24.696783] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:41.186 [2024-06-10 11:54:24.704783] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:43:41.186 [2024-06-10 11:54:24.704796] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:43:41.186 [2024-06-10 11:54:24.704803] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:41.186 11:54:24 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:41.186 11:54:24 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:43:41.186 11:54:24 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:43:41.186 I/O targets: 00:43:41.186 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:43:41.187 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:43:41.187 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:43:41.187 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:43:41.187 00:43:41.187 00:43:41.187 CUnit - A unit testing framework for C - Version 2.1-3 00:43:41.187 http://cunit.sourceforge.net/ 00:43:41.187 00:43:41.187 00:43:41.187 Suite: bdevio tests on: crypto_ram4 00:43:41.187 Test: blockdev write read block ...passed 00:43:41.187 Test: blockdev write zeroes read block ...passed 00:43:41.187 Test: blockdev write zeroes read no split ...passed 00:43:41.187 Test: blockdev write zeroes read split ...passed 00:43:41.187 Test: blockdev write zeroes read split partial ...passed 00:43:41.187 Test: blockdev reset ...passed 00:43:41.187 Test: blockdev write read 8 blocks ...passed 00:43:41.187 Test: blockdev write read size > 128k ...passed 00:43:41.187 Test: blockdev write read invalid size ...passed 00:43:41.187 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:43:41.187 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:43:41.187 Test: blockdev write read max offset ...passed 00:43:41.187 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:43:41.187 Test: blockdev writev readv 8 blocks ...passed 00:43:41.187 Test: blockdev writev readv 30 x 1block ...passed 00:43:41.187 Test: blockdev writev readv block ...passed 00:43:41.187 Test: blockdev writev readv size > 128k ...passed 00:43:41.187 Test: blockdev writev readv size > 128k in two iovs ...passed 00:43:41.187 Test: blockdev comparev and writev ...passed 00:43:41.187 Test: blockdev nvme 
passthru rw ...passed 00:43:41.187 Test: blockdev nvme passthru vendor specific ...passed 00:43:41.187 Test: blockdev nvme admin passthru ...passed 00:43:41.187 Test: blockdev copy ...passed 00:43:41.187 Suite: bdevio tests on: crypto_ram3 00:43:41.187 Test: blockdev write read block ...passed 00:43:41.187 Test: blockdev write zeroes read block ...passed 00:43:41.187 Test: blockdev write zeroes read no split ...passed 00:43:41.187 Test: blockdev write zeroes read split ...passed 00:43:41.187 Test: blockdev write zeroes read split partial ...passed 00:43:41.187 Test: blockdev reset ...passed 00:43:41.187 Test: blockdev write read 8 blocks ...passed 00:43:41.187 Test: blockdev write read size > 128k ...passed 00:43:41.187 Test: blockdev write read invalid size ...passed 00:43:41.187 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:43:41.187 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:43:41.187 Test: blockdev write read max offset ...passed 00:43:41.187 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:43:41.187 Test: blockdev writev readv 8 blocks ...passed 00:43:41.187 Test: blockdev writev readv 30 x 1block ...passed 00:43:41.187 Test: blockdev writev readv block ...passed 00:43:41.187 Test: blockdev writev readv size > 128k ...passed 00:43:41.187 Test: blockdev writev readv size > 128k in two iovs ...passed 00:43:41.187 Test: blockdev comparev and writev ...passed 00:43:41.187 Test: blockdev nvme passthru rw ...passed 00:43:41.187 Test: blockdev nvme passthru vendor specific ...passed 00:43:41.187 Test: blockdev nvme admin passthru ...passed 00:43:41.187 Test: blockdev copy ...passed 00:43:41.187 Suite: bdevio tests on: crypto_ram2 00:43:41.187 Test: blockdev write read block ...passed 00:43:41.187 Test: blockdev write zeroes read block ...passed 00:43:41.187 Test: blockdev write zeroes read no split ...passed 00:43:41.187 Test: blockdev write zeroes read split ...passed 
00:43:41.187 Test: blockdev write zeroes read split partial ...passed 00:43:41.187 Test: blockdev reset ...passed 00:43:41.187 Test: blockdev write read 8 blocks ...passed 00:43:41.187 Test: blockdev write read size > 128k ...passed 00:43:41.187 Test: blockdev write read invalid size ...passed 00:43:41.187 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:43:41.187 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:43:41.187 Test: blockdev write read max offset ...passed 00:43:41.187 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:43:41.187 Test: blockdev writev readv 8 blocks ...passed 00:43:41.187 Test: blockdev writev readv 30 x 1block ...passed 00:43:41.187 Test: blockdev writev readv block ...passed 00:43:41.187 Test: blockdev writev readv size > 128k ...passed 00:43:41.187 Test: blockdev writev readv size > 128k in two iovs ...passed 00:43:41.187 Test: blockdev comparev and writev ...passed 00:43:41.187 Test: blockdev nvme passthru rw ...passed 00:43:41.187 Test: blockdev nvme passthru vendor specific ...passed 00:43:41.187 Test: blockdev nvme admin passthru ...passed 00:43:41.187 Test: blockdev copy ...passed 00:43:41.187 Suite: bdevio tests on: crypto_ram 00:43:41.187 Test: blockdev write read block ...passed 00:43:41.187 Test: blockdev write zeroes read block ...passed 00:43:41.187 Test: blockdev write zeroes read no split ...passed 00:43:41.187 Test: blockdev write zeroes read split ...passed 00:43:41.447 Test: blockdev write zeroes read split partial ...passed 00:43:41.447 Test: blockdev reset ...passed 00:43:41.447 Test: blockdev write read 8 blocks ...passed 00:43:41.447 Test: blockdev write read size > 128k ...passed 00:43:41.447 Test: blockdev write read invalid size ...passed 00:43:41.447 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:43:41.447 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:43:41.447 Test: blockdev 
write read max offset ...passed 00:43:41.447 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:43:41.447 Test: blockdev writev readv 8 blocks ...passed 00:43:41.447 Test: blockdev writev readv 30 x 1block ...passed 00:43:41.447 Test: blockdev writev readv block ...passed 00:43:41.447 Test: blockdev writev readv size > 128k ...passed 00:43:41.447 Test: blockdev writev readv size > 128k in two iovs ...passed 00:43:41.447 Test: blockdev comparev and writev ...passed 00:43:41.447 Test: blockdev nvme passthru rw ...passed 00:43:41.447 Test: blockdev nvme passthru vendor specific ...passed 00:43:41.447 Test: blockdev nvme admin passthru ...passed 00:43:41.447 Test: blockdev copy ...passed 00:43:41.447 00:43:41.447 Run Summary: Type Total Ran Passed Failed Inactive 00:43:41.447 suites 4 4 n/a 0 0 00:43:41.447 tests 92 92 92 0 0 00:43:41.447 asserts 520 520 520 0 n/a 00:43:41.447 00:43:41.447 Elapsed time = 0.532 seconds 00:43:41.447 0 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 277441 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 277441 ']' 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 277441 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 277441 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 
277441' 00:43:41.447 killing process with pid 277441 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # kill 277441 00:43:41.447 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@973 -- # wait 277441 00:43:41.707 11:54:25 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:43:41.707 00:43:41.707 real 0m3.464s 00:43:41.707 user 0m9.674s 00:43:41.707 sys 0m0.506s 00:43:41.707 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:43:41.707 11:54:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:43:41.707 ************************************ 00:43:41.707 END TEST bdev_bounds 00:43:41.707 ************************************ 00:43:41.707 11:54:25 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:43:41.707 11:54:25 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:43:41.707 11:54:25 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:41.707 11:54:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:41.967 ************************************ 00:43:41.967 START TEST bdev_nbd 00:43:41.967 ************************************ 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=277845 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 277845 /var/tmp/spdk-nbd.sock 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 277845 ']' 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:43:41.967 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:43:41.967 11:54:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:43:41.967 [2024-06-10 11:54:25.745970] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:43:41.967 [2024-06-10 11:54:25.746023] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:43:41.967 [2024-06-10 11:54:25.833575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:42.226 [2024-06-10 11:54:25.922235] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:43:42.227 [2024-06-10 11:54:25.943117] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:43:42.227 [2024-06-10 11:54:25.951141] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:43:42.227 [2024-06-10 11:54:25.959158] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:43:42.227 [2024-06-10 11:54:26.065361] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:43:44.763 [2024-06-10 11:54:28.259198] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:43:44.763 [2024-06-10 11:54:28.259253] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:43:44.763 [2024-06-10 11:54:28.259263] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:44.763 [2024-06-10 11:54:28.267217] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:43:44.763 [2024-06-10 11:54:28.267231] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:43:44.763 [2024-06-10 11:54:28.267239] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:44.763 [2024-06-10 11:54:28.275237] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:43:44.763 
[2024-06-10 11:54:28.275253] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:43:44.763 [2024-06-10 11:54:28.275261] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:44.763 [2024-06-10 11:54:28.283257] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:43:44.763 [2024-06-10 11:54:28.283269] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:43:44.763 [2024-06-10 11:54:28.283276] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:43:44.763 
11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:44.763 1+0 records in 00:43:44.763 1+0 records out 00:43:44.763 4096 
bytes (4.1 kB, 4.0 KiB) copied, 0.000187872 s, 21.8 MB/s 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:43:44.763 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:43:45.022 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:43:45.022 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:43:45.022 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:43:45.022 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:43:45.022 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@872 -- # break 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:45.023 1+0 records in 00:43:45.023 1+0 records out 00:43:45.023 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281973 s, 14.5 MB/s 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:43:45.023 11:54:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:45.282 1+0 records in 00:43:45.282 1+0 records out 00:43:45.282 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287199 s, 14.3 MB/s 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:43:45.282 11:54:29 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:43:45.282 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:45.541 1+0 records in 00:43:45.541 1+0 records out 00:43:45.541 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268507 s, 15.3 MB/s 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:43:45.541 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:43:45.541 { 00:43:45.541 "nbd_device": "/dev/nbd0", 00:43:45.541 "bdev_name": "crypto_ram" 00:43:45.541 }, 00:43:45.541 { 00:43:45.542 "nbd_device": "/dev/nbd1", 00:43:45.542 "bdev_name": "crypto_ram2" 00:43:45.542 }, 00:43:45.542 { 00:43:45.542 "nbd_device": "/dev/nbd2", 00:43:45.542 "bdev_name": "crypto_ram3" 00:43:45.542 }, 00:43:45.542 { 00:43:45.542 "nbd_device": "/dev/nbd3", 00:43:45.542 "bdev_name": "crypto_ram4" 00:43:45.542 } 00:43:45.542 ]' 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:43:45.542 { 00:43:45.542 "nbd_device": "/dev/nbd0", 00:43:45.542 "bdev_name": "crypto_ram" 00:43:45.542 }, 00:43:45.542 { 00:43:45.542 "nbd_device": "/dev/nbd1", 00:43:45.542 "bdev_name": "crypto_ram2" 00:43:45.542 }, 00:43:45.542 { 00:43:45.542 "nbd_device": "/dev/nbd2", 00:43:45.542 "bdev_name": "crypto_ram3" 00:43:45.542 }, 00:43:45.542 { 00:43:45.542 "nbd_device": "/dev/nbd3", 00:43:45.542 "bdev_name": "crypto_ram4" 00:43:45.542 } 00:43:45.542 ]' 00:43:45.542 11:54:29 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:45.542 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:45.801 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:46.060 11:54:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:46.319 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:43:46.578 11:54:30 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 
'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:43:46.578 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:43:46.838 /dev/nbd0 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:46.838 1+0 records in 00:43:46.838 1+0 records out 00:43:46.838 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027249 s, 15.0 MB/s 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:43:46.838 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:43:47.097 /dev/nbd1 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:47.097 11:54:30 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:47.097 1+0 records in 00:43:47.097 1+0 records out 00:43:47.097 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281674 s, 14.5 MB/s 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:43:47.097 11:54:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:43:47.357 /dev/nbd10 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd10 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:47.357 1+0 records in 00:43:47.357 1+0 records out 00:43:47.357 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026311 s, 15.6 MB/s 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:43:47.357 /dev/nbd11 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:43:47.357 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:43:47.616 1+0 records in 00:43:47.616 1+0 records out 00:43:47.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289747 s, 14.1 MB/s 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd0", 00:43:47.616 "bdev_name": "crypto_ram" 00:43:47.616 }, 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd1", 00:43:47.616 "bdev_name": "crypto_ram2" 00:43:47.616 }, 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd10", 00:43:47.616 "bdev_name": "crypto_ram3" 00:43:47.616 }, 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd11", 00:43:47.616 "bdev_name": "crypto_ram4" 00:43:47.616 } 00:43:47.616 ]' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd0", 00:43:47.616 "bdev_name": "crypto_ram" 00:43:47.616 }, 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd1", 00:43:47.616 "bdev_name": "crypto_ram2" 00:43:47.616 }, 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd10", 00:43:47.616 "bdev_name": "crypto_ram3" 00:43:47.616 }, 00:43:47.616 { 00:43:47.616 "nbd_device": "/dev/nbd11", 00:43:47.616 
"bdev_name": "crypto_ram4" 00:43:47.616 } 00:43:47.616 ]' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:43:47.616 /dev/nbd1 00:43:47.616 /dev/nbd10 00:43:47.616 /dev/nbd11' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:43:47.616 /dev/nbd1 00:43:47.616 /dev/nbd10 00:43:47.616 /dev/nbd11' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:43:47.616 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:43:47.875 256+0 records in 00:43:47.875 
256+0 records out 00:43:47.875 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00435646 s, 241 MB/s 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:43:47.875 256+0 records in 00:43:47.875 256+0 records out 00:43:47.875 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0369399 s, 28.4 MB/s 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:43:47.875 256+0 records in 00:43:47.875 256+0 records out 00:43:47.875 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0352799 s, 29.7 MB/s 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:43:47.875 256+0 records in 00:43:47.875 256+0 records out 00:43:47.875 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0295302 s, 35.5 MB/s 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:43:47.875 256+0 records in 00:43:47.875 256+0 records out 00:43:47.875 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.034544 s, 30.4 MB/s 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 
-- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:47.875 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:48.134 11:54:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:48.393 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:48.652 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:43:48.911 11:54:32 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:43:48.911 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:43:49.170 malloc_lvol_verify 00:43:49.170 11:54:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:43:49.170 91f25884-3b4b-4d4d-a598-16cb0d3bbef3 00:43:49.428 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:43:49.428 a5dfcf61-4d3c-44ea-b307-a3b193e938b4 00:43:49.428 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:43:49.687 /dev/nbd0 00:43:49.687 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:43:49.687 mke2fs 1.46.5 (30-Dec-2021) 00:43:49.687 Discarding device blocks: 0/4096 done 00:43:49.687 Creating filesystem with 4096 1k blocks and 1024 inodes 00:43:49.687 00:43:49.687 Allocating group tables: 0/1 done 00:43:49.687 Writing inode tables: 0/1 done 00:43:49.687 Creating journal (1024 blocks): done 00:43:49.687 Writing superblocks and filesystem accounting information: 0/1 done 00:43:49.687 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:43:49.688 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:43:49.947 11:54:33 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 277845 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 277845 ']' 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 277845 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 277845 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 277845' 00:43:49.947 killing process with pid 277845 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@968 -- # kill 277845 00:43:49.947 11:54:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@973 -- # wait 277845 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:43:50.206 00:43:50.206 real 0m8.385s 00:43:50.206 user 0m10.525s 00:43:50.206 sys 0m3.207s 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:43:50.206 ************************************ 00:43:50.206 END TEST bdev_nbd 00:43:50.206 ************************************ 00:43:50.206 11:54:34 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:43:50.206 11:54:34 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:43:50.206 11:54:34 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:43:50.206 11:54:34 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:43:50.206 11:54:34 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:43:50.206 11:54:34 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:50.206 11:54:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:43:50.206 ************************************ 00:43:50.206 START TEST bdev_fio 00:43:50.206 ************************************ 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:43:50.206 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:43:50.206 11:54:34 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:43:50.206 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:43:50.465 
11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:43:50.465 ************************************ 00:43:50.465 START TEST bdev_fio_rw_verify 00:43:50.465 ************************************ 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:43:50.465 11:54:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:43:50.725 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:43:50.725 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:43:50.725 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:43:50.725 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:43:50.725 fio-3.35 00:43:50.725 Starting 4 threads 00:44:05.612 00:44:05.612 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=279642: Mon Jun 10 11:54:47 2024 00:44:05.612 read: IOPS=25.0k, BW=97.7MiB/s 
(102MB/s)(977MiB/10001msec) 00:44:05.612 slat (usec): min=10, max=401, avg=56.41, stdev=32.56 00:44:05.612 clat (usec): min=9, max=1794, avg=291.55, stdev=188.97 00:44:05.612 lat (usec): min=39, max=1888, avg=347.96, stdev=206.99 00:44:05.612 clat percentiles (usec): 00:44:05.612 | 50.000th=[ 247], 99.000th=[ 914], 99.900th=[ 1156], 99.990th=[ 1352], 00:44:05.612 | 99.999th=[ 1631] 00:44:05.612 write: IOPS=27.6k, BW=108MiB/s (113MB/s)(1050MiB/9730msec); 0 zone resets 00:44:05.612 slat (usec): min=18, max=436, avg=64.59, stdev=31.68 00:44:05.612 clat (usec): min=24, max=1828, avg=343.06, stdev=215.70 00:44:05.612 lat (usec): min=56, max=1940, avg=407.65, stdev=232.93 00:44:05.612 clat percentiles (usec): 00:44:05.612 | 50.000th=[ 302], 99.000th=[ 1057], 99.900th=[ 1352], 99.990th=[ 1565], 00:44:05.612 | 99.999th=[ 1713] 00:44:05.612 bw ( KiB/s): min=86808, max=146800, per=97.48%, avg=107710.74, stdev=4238.05, samples=76 00:44:05.612 iops : min=21702, max=36700, avg=26927.68, stdev=1059.51, samples=76 00:44:05.612 lat (usec) : 10=0.01%, 20=0.01%, 50=0.57%, 100=8.96%, 250=34.95% 00:44:05.612 lat (usec) : 500=39.27%, 750=11.78%, 1000=3.52% 00:44:05.612 lat (msec) : 2=0.95% 00:44:05.612 cpu : usr=99.67%, sys=0.01%, ctx=119, majf=0, minf=314 00:44:05.612 IO depths : 1=10.1%, 2=25.6%, 4=51.2%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:44:05.612 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:44:05.612 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:44:05.612 issued rwts: total=250182,268766,0,0 short=0,0,0,0 dropped=0,0,0,0 00:44:05.612 latency : target=0, window=0, percentile=100.00%, depth=8 00:44:05.612 00:44:05.612 Run status group 0 (all jobs): 00:44:05.612 READ: bw=97.7MiB/s (102MB/s), 97.7MiB/s-97.7MiB/s (102MB/s-102MB/s), io=977MiB (1025MB), run=10001-10001msec 00:44:05.612 WRITE: bw=108MiB/s (113MB/s), 108MiB/s-108MiB/s (113MB/s-113MB/s), io=1050MiB (1101MB), run=9730-9730msec 00:44:05.612 00:44:05.612 real 
0m13.439s 00:44:05.612 user 0m45.564s 00:44:05.612 sys 0m0.477s 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:44:05.612 ************************************ 00:44:05.612 END TEST bdev_fio_rw_verify 00:44:05.612 ************************************ 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:44:05.612 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "fe30364a-b10b-50e8-8e8b-0a2d31ea9b3e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fe30364a-b10b-50e8-8e8b-0a2d31ea9b3e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2bc1a8c1-a478-574c-abbc-147afca836f4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"2bc1a8c1-a478-574c-abbc-147afca836f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da8eb50c-6ea2-5544-a953-5921563ec487"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da8eb50c-6ea2-5544-a953-5921563ec487",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' 
"0f9d5200-8433-5694-a444-28dbe4407d6e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "0f9d5200-8433-5694-a444-28dbe4407d6e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:44:05.613 crypto_ram2 00:44:05.613 crypto_ram3 00:44:05.613 crypto_ram4 ]] 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "fe30364a-b10b-50e8-8e8b-0a2d31ea9b3e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fe30364a-b10b-50e8-8e8b-0a2d31ea9b3e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2bc1a8c1-a478-574c-abbc-147afca836f4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2bc1a8c1-a478-574c-abbc-147afca836f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da8eb50c-6ea2-5544-a953-5921563ec487"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da8eb50c-6ea2-5544-a953-5921563ec487",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": 
true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "0f9d5200-8433-5694-a444-28dbe4407d6e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "0f9d5200-8433-5694-a444-28dbe4407d6e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:44:05.613 11:54:47 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:44:05.613 ************************************ 00:44:05.613 START TEST bdev_fio_trim 00:44:05.613 ************************************ 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:44:05.613 11:54:47 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:44:05.613 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:44:05.614 11:54:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:05.614 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:44:05.614 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:44:05.614 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:44:05.614 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:44:05.614 fio-3.35 00:44:05.614 Starting 4 threads 00:44:17.837 00:44:17.837 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=281504: Mon Jun 10 11:55:00 2024 00:44:17.837 write: IOPS=41.1k, BW=160MiB/s (168MB/s)(1604MiB/10001msec); 0 zone resets 00:44:17.837 slat (usec): min=10, max=1202, avg=55.58, stdev=30.38 00:44:17.837 clat (usec): min=32, max=1863, avg=245.47, stdev=148.15 00:44:17.837 lat (usec): min=42, max=2162, avg=301.05, stdev=164.08 00:44:17.837 clat percentiles (usec): 00:44:17.837 | 50.000th=[ 215], 99.000th=[ 725], 99.900th=[ 857], 99.990th=[ 996], 00:44:17.837 | 99.999th=[ 1680] 00:44:17.837 bw ( KiB/s): min=153744, max=244029, per=100.00%, avg=164480.68, stdev=5203.99, samples=76 00:44:17.837 iops : min=38436, max=61007, avg=41120.16, stdev=1300.98, samples=76 00:44:17.837 trim: IOPS=41.1k, BW=160MiB/s (168MB/s)(1604MiB/10001msec); 0 zone resets 00:44:17.837 slat (usec): min=4, max=116, avg=16.55, stdev= 6.65 00:44:17.837 clat (usec): min=43, max=1498, avg=231.40, stdev=102.69 00:44:17.837 lat (usec): min=47, max=1514, avg=247.94, stdev=104.39 00:44:17.837 clat percentiles (usec): 00:44:17.837 | 50.000th=[ 221], 99.000th=[ 515], 99.900th=[ 627], 
99.990th=[ 734], 00:44:17.837 | 99.999th=[ 1205] 00:44:17.837 bw ( KiB/s): min=153744, max=244061, per=100.00%, avg=164481.53, stdev=5205.06, samples=76 00:44:17.837 iops : min=38436, max=61015, avg=41120.37, stdev=1301.24, samples=76 00:44:17.837 lat (usec) : 50=0.57%, 100=9.16%, 250=51.40%, 500=34.45%, 750=4.07% 00:44:17.837 lat (usec) : 1000=0.34% 00:44:17.837 lat (msec) : 2=0.01% 00:44:17.837 cpu : usr=99.67%, sys=0.00%, ctx=64, majf=0, minf=105 00:44:17.837 IO depths : 1=7.6%, 2=26.4%, 4=52.8%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:44:17.837 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:44:17.837 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:44:17.837 issued rwts: total=0,410730,410730,0 short=0,0,0,0 dropped=0,0,0,0 00:44:17.837 latency : target=0, window=0, percentile=100.00%, depth=8 00:44:17.837 00:44:17.837 Run status group 0 (all jobs): 00:44:17.837 WRITE: bw=160MiB/s (168MB/s), 160MiB/s-160MiB/s (168MB/s-168MB/s), io=1604MiB (1682MB), run=10001-10001msec 00:44:17.837 TRIM: bw=160MiB/s (168MB/s), 160MiB/s-160MiB/s (168MB/s-168MB/s), io=1604MiB (1682MB), run=10001-10001msec 00:44:17.837 00:44:17.837 real 0m13.426s 00:44:17.837 user 0m45.665s 00:44:17.837 sys 0m0.464s 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:44:17.837 ************************************ 00:44:17.837 END TEST bdev_fio_trim 00:44:17.837 ************************************ 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:44:17.837 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:44:17.837 00:44:17.837 real 0m27.194s 00:44:17.837 user 1m31.382s 00:44:17.837 sys 0m1.139s 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:44:17.837 ************************************ 00:44:17.837 END TEST bdev_fio 00:44:17.837 ************************************ 00:44:17.837 11:55:01 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:44:17.837 11:55:01 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:44:17.837 11:55:01 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:44:17.837 11:55:01 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:17.837 11:55:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:44:17.837 ************************************ 00:44:17.837 START TEST bdev_verify 00:44:17.837 ************************************ 00:44:17.837 11:55:01 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:44:17.837 [2024-06-10 11:55:01.474413] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:44:17.837 [2024-06-10 11:55:01.474462] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid282855 ] 00:44:17.837 [2024-06-10 11:55:01.561624] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:44:17.837 [2024-06-10 11:55:01.644505] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:44:17.837 [2024-06-10 11:55:01.644507] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:17.837 [2024-06-10 11:55:01.665499] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:44:17.837 [2024-06-10 11:55:01.673533] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:44:17.837 [2024-06-10 11:55:01.681546] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:44:18.096 [2024-06-10 11:55:01.789333] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:44:20.634 [2024-06-10 11:55:03.981042] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:44:20.634 [2024-06-10 11:55:03.981103] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:44:20.634 [2024-06-10 11:55:03.981113] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:20.634 [2024-06-10 11:55:03.989060] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:44:20.634 [2024-06-10 11:55:03.989077] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:44:20.634 [2024-06-10 11:55:03.989085] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:20.634 [2024-06-10 
11:55:03.997083] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:44:20.634 [2024-06-10 11:55:03.997096] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:44:20.634 [2024-06-10 11:55:03.997104] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:20.634 [2024-06-10 11:55:04.005106] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:44:20.634 [2024-06-10 11:55:04.005119] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:44:20.634 [2024-06-10 11:55:04.005126] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:20.634 Running I/O for 5 seconds... 00:44:25.909 00:44:25.909 Latency(us) 00:44:25.909 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:44:25.909 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x0 length 0x1000 00:44:25.909 crypto_ram : 5.05 724.46 2.83 0.00 0.00 176131.63 1980.33 121270.09 00:44:25.909 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x1000 length 0x1000 00:44:25.909 crypto_ram : 5.05 725.81 2.84 0.00 0.00 175758.18 2222.53 120358.29 00:44:25.909 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x0 length 0x1000 00:44:25.909 crypto_ram2 : 5.05 725.92 2.84 0.00 0.00 175500.93 2322.25 111696.14 00:44:25.909 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x1000 length 0x1000 00:44:25.909 crypto_ram2 : 5.05 728.81 2.85 0.00 0.00 174848.84 2436.23 111696.14 00:44:25.909 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification 
LBA range: start 0x0 length 0x1000 00:44:25.909 crypto_ram3 : 5.03 5693.47 22.24 0.00 0.00 22336.86 2778.16 19147.91 00:44:25.909 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x1000 length 0x1000 00:44:25.909 crypto_ram3 : 5.03 5698.30 22.26 0.00 0.00 22309.28 5442.34 19033.93 00:44:25.909 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x0 length 0x1000 00:44:25.909 crypto_ram4 : 5.04 5694.05 22.24 0.00 0.00 22287.20 2279.51 18008.15 00:44:25.909 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:44:25.909 Verification LBA range: start 0x1000 length 0x1000 00:44:25.909 crypto_ram4 : 5.04 5714.91 22.32 0.00 0.00 22206.80 2478.97 17666.23 00:44:25.909 =================================================================================================================== 00:44:25.909 Total : 25705.72 100.41 0.00 0.00 39640.41 1980.33 121270.09 00:44:25.909 00:44:25.909 real 0m8.119s 00:44:25.909 user 0m15.498s 00:44:25.909 sys 0m0.346s 00:44:25.909 11:55:09 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:25.909 11:55:09 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:44:25.909 ************************************ 00:44:25.909 END TEST bdev_verify 00:44:25.909 ************************************ 00:44:25.909 11:55:09 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:44:25.909 11:55:09 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:44:25.909 11:55:09 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:25.909 11:55:09 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:44:25.909 ************************************ 00:44:25.909 START TEST bdev_verify_big_io 00:44:25.909 ************************************ 00:44:25.909 11:55:09 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:44:25.909 [2024-06-10 11:55:09.659691] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:44:25.909 [2024-06-10 11:55:09.659740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid283894 ] 00:44:25.909 [2024-06-10 11:55:09.748237] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:44:25.909 [2024-06-10 11:55:09.831577] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:44:25.909 [2024-06-10 11:55:09.831580] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:25.909 [2024-06-10 11:55:09.852593] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:44:26.168 [2024-06-10 11:55:09.860624] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:44:26.168 [2024-06-10 11:55:09.868635] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:44:26.168 [2024-06-10 11:55:09.969889] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:44:28.702 [2024-06-10 11:55:12.158409] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:44:28.702 [2024-06-10 11:55:12.158469] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc0 00:44:28.702 [2024-06-10 11:55:12.158479] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:28.702 [2024-06-10 11:55:12.166427] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:44:28.702 [2024-06-10 11:55:12.166443] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:44:28.702 [2024-06-10 11:55:12.166452] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:28.702 [2024-06-10 11:55:12.174449] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:44:28.702 [2024-06-10 11:55:12.174462] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:44:28.702 [2024-06-10 11:55:12.174470] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:28.702 [2024-06-10 11:55:12.182471] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:44:28.702 [2024-06-10 11:55:12.182483] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:44:28.702 [2024-06-10 11:55:12.182491] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:28.702 Running I/O for 5 seconds... 00:44:30.142 [2024-06-10 11:55:14.075893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.142 [2024-06-10 11:55:14.076017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.142 [2024-06-10 11:55:14.076291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:44:30.142 [2024-06-10 11:55:14.076340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.463 [2024-06-10 11:55:14.086413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.463 [2024-06-10 11:55:14.086462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.463 [2024-06-10 11:55:14.086504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.463 [2024-06-10 11:55:14.086533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.463 [2024-06-10 11:55:14.186479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.186560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.186812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.320257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.320311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.320354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.320617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.321666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.463 [2024-06-10 11:55:14.321723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.321761] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.321794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.322189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.322240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.322279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.322311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323370] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.463 [2024-06-10 11:55:14.323797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.323827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.324807] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.324859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.324910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.324949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.325347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.325390] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.325436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.325478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.463 [2024-06-10 11:55:14.326448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326919] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.326983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.327936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.327993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.328027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.328058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.328355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.328394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.328427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.328465] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.463 [2024-06-10 11:55:14.329755] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.329801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.329832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.329862] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.330159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.330196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.330227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.330258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.463 [2024-06-10 11:55:14.331838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.331908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.332773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.332823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.332861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.332900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.333223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.333260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.333290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.333320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.334172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.463 [2024-06-10 11:55:14.334850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.334895] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.335270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.335952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.335998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.336720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.336758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.337608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.338377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.338413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.339438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.340899] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.340945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.341237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.341273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.342186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.343141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.343186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.343987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.344581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.344625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.345604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.345647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.346524] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.347460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.347506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.347768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.348623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.348669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.349394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.349434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.350268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.350737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.350773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.351652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.352777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.352821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.353356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.353393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.354290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.355183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.355224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.355795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.356311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.356361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.357173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.357208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.358025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.358299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.358338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.359215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.360241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.360287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.360552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.360589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.361673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.362762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.362809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.363229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.364322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.364367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.365344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.365387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.366394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.367415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.367453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.368262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.369494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.369540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.370108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.370148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.371246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.371598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.371635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.372480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.373748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.373794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.374409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.374448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.464 [2024-06-10 11:55:14.377717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.378417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.378455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.379143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.380311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.380358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.381375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.381413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.382330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.382829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.382875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.383447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.464 [2024-06-10 11:55:14.384190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.465 [2024-06-10 11:55:14.384239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.385179] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.385216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.386047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.386411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.386449] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.387299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.388422] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.388469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.388751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.388788] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.390745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.391853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.465 [2024-06-10 11:55:14.391902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.392643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.393378] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.393423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.394105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.394141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.396340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.397352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.465 [2024-06-10 11:55:14.397391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.397908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.399126] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.399178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.400187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.737 [2024-06-10 11:55:14.400229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.402122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.402164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.403102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.403141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.404275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.404320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.405419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.405460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.409049] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.409102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.409392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.409428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.737 [2024-06-10 11:55:14.410000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.410045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.410888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.410925] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.414306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.414351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.415309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.415346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.416398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.737 [2024-06-10 11:55:14.416444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.417323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.417360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.420438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.738 [2024-06-10 11:55:14.420486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.421464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.421504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.422816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.422872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.423693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.423727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.426025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.426072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.427013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.427054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.427791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.427835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.738 [2024-06-10 11:55:14.428658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.428694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.432545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.432592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.432929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.432965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.433271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.434142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.434185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.435200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.438193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.438239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.438506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.738 [2024-06-10 11:55:14.438540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.438828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.439652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.439702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.440537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.443723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.443769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.444052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.444087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.444479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.445426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.446288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.446322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:30.738 [2024-06-10 11:55:14.449971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:30.738 [2024-06-10 11:55:14.631810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.738 [2024-06-10 11:55:14.631919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.738 [2024-06-10 11:55:14.631951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:30.738 [2024-06-10 11:55:14.631969] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.307 [2024-06-10 11:55:14.958388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.307 [2024-06-10 11:55:14.958455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.307 [2024-06-10 11:55:14.958720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.566 [2024-06-10 11:55:15.445985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.566 [2024-06-10 11:55:15.446064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.566 [2024-06-10 11:55:15.446116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:31.566 [2024-06-10 11:55:15.446136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:32.504 [2024-06-10 11:55:16.179140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:44:32.504 [2024-06-10 11:55:16.179841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:32.504 [2024-06-10 11:55:16.179910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:32.504 [2024-06-10 11:55:16.179940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:44:32.504 [2024-06-10 11:55:16.410191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:32.504 [2024-06-10 11:55:16.410275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:32.504 [2024-06-10 11:55:16.410312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:44:32.504 [2024-06-10 11:55:16.411199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:44:34.411 00:44:34.411 Latency(us) 00:44:34.411 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:44:34.411 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x0 length 0x100 00:44:34.411 crypto_ram : 5.58 67.93 4.25 0.00 0.00 1842522.87 48325.68 1685016.04 00:44:34.411 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x100 length 0x100 00:44:34.411 crypto_ram : 5.57 68.27 4.27 0.00 0.00 1833601.00 52428.80 1641249.39 00:44:34.411 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x0 length 0x100 00:44:34.411 crypto_ram2 : 5.58 68.10 4.26 0.00 0.00 1795632.71 47869.77 1685016.04 00:44:34.411 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x100 length 0x100 00:44:34.411 crypto_ram2 : 5.57 68.26 4.27 0.00 0.00 1790211.67 52200.85 1648543.83 00:44:34.411 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x0 length 0x100 00:44:34.411 crypto_ram3 : 5.39 452.78 28.30 0.00 0.00 262353.78 10428.77 439490.11 00:44:34.411 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x100 length 0x100 00:44:34.411 crypto_ram3 : 5.39 460.00 28.75 0.00 0.00 258232.63 15842.62 368369.31 00:44:34.411 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x0 length 0x100 00:44:34.411 crypto_ram4 : 5.44 467.38 29.21 0.00 0.00 249775.01 12651.30 348309.59 00:44:34.411 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:44:34.411 Verification LBA range: start 0x100 length 0x100 00:44:34.411 crypto_ram4 : 5.43 474.33 29.65 0.00 0.00 246506.15 
13734.07 335544.32 00:44:34.411 =================================================================================================================== 00:44:34.411 Total : 2127.06 132.94 0.00 0.00 459322.96 10428.77 1685016.04 00:44:34.411 00:44:34.411 real 0m8.599s 00:44:34.411 user 0m16.479s 00:44:34.411 sys 0m0.338s 00:44:34.411 11:55:18 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:34.411 11:55:18 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:44:34.411 ************************************ 00:44:34.411 END TEST bdev_verify_big_io 00:44:34.411 ************************************ 00:44:34.411 11:55:18 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:44:34.411 11:55:18 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:44:34.411 11:55:18 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:34.411 11:55:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:44:34.411 ************************************ 00:44:34.411 START TEST bdev_write_zeroes 00:44:34.411 ************************************ 00:44:34.411 11:55:18 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:44:34.411 [2024-06-10 11:55:18.353057] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:44:34.411 [2024-06-10 11:55:18.353104] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid285113 ] 00:44:34.671 [2024-06-10 11:55:18.437495] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:44:34.671 [2024-06-10 11:55:18.517608] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:34.671 [2024-06-10 11:55:18.538481] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:44:34.671 [2024-06-10 11:55:18.546510] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:44:34.671 [2024-06-10 11:55:18.554530] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:44:34.931 [2024-06-10 11:55:18.657059] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:44:37.466 [2024-06-10 11:55:20.848007] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:44:37.466 [2024-06-10 11:55:20.848069] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:44:37.466 [2024-06-10 11:55:20.848079] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:37.466 [2024-06-10 11:55:20.856025] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:44:37.466 [2024-06-10 11:55:20.856041] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:44:37.466 [2024-06-10 11:55:20.856049] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:37.466 [2024-06-10 11:55:20.864044] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 
00:44:37.466 [2024-06-10 11:55:20.864059] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:44:37.466 [2024-06-10 11:55:20.864067] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:37.466 [2024-06-10 11:55:20.872065] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:44:37.466 [2024-06-10 11:55:20.872080] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:44:37.466 [2024-06-10 11:55:20.872089] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:37.466 Running I/O for 1 seconds... 00:44:38.035 00:44:38.035 Latency(us) 00:44:38.035 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:44:38.035 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:44:38.035 crypto_ram : 1.02 3093.14 12.08 0.00 0.00 41177.39 3504.75 49693.38 00:44:38.035 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:44:38.035 crypto_ram2 : 1.02 3106.59 12.14 0.00 0.00 40881.63 3490.50 46274.11 00:44:38.035 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:44:38.035 crypto_ram3 : 1.01 24134.11 94.27 0.00 0.00 5253.71 1638.40 6867.03 00:44:38.035 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:44:38.035 crypto_ram4 : 1.01 24171.65 94.42 0.00 0.00 5235.12 1545.79 5784.26 00:44:38.035 =================================================================================================================== 00:44:38.035 Total : 54505.48 212.91 0.00 0.00 9326.27 1545.79 49693.38 00:44:38.603 00:44:38.603 real 0m4.029s 00:44:38.603 user 0m3.656s 00:44:38.603 sys 0m0.339s 00:44:38.603 11:55:22 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:38.603 11:55:22 
blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:44:38.603 ************************************ 00:44:38.603 END TEST bdev_write_zeroes 00:44:38.603 ************************************ 00:44:38.603 11:55:22 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:44:38.603 11:55:22 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:44:38.603 11:55:22 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:38.603 11:55:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:44:38.603 ************************************ 00:44:38.603 START TEST bdev_json_nonenclosed 00:44:38.603 ************************************ 00:44:38.603 11:55:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:44:38.603 [2024-06-10 11:55:22.467782] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:44:38.603 [2024-06-10 11:55:22.467829] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid285645 ] 00:44:38.862 [2024-06-10 11:55:22.553524] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:44:38.862 [2024-06-10 11:55:22.634518] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:38.862 [2024-06-10 11:55:22.634578] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:44:38.862 [2024-06-10 11:55:22.634593] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:44:38.862 [2024-06-10 11:55:22.634602] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:44:38.862 00:44:38.862 real 0m0.304s 00:44:38.862 user 0m0.190s 00:44:38.862 sys 0m0.112s 00:44:38.862 11:55:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:38.862 11:55:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:44:38.862 ************************************ 00:44:38.862 END TEST bdev_json_nonenclosed 00:44:38.862 ************************************ 00:44:38.862 11:55:22 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:44:38.862 11:55:22 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:44:38.862 11:55:22 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:38.862 11:55:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:44:38.862 ************************************ 00:44:38.862 START TEST bdev_json_nonarray 00:44:38.862 ************************************ 00:44:38.862 11:55:22 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:44:39.122 [2024-06-10 11:55:22.838370] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:44:39.122 [2024-06-10 11:55:22.838417] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid285683 ] 00:44:39.122 [2024-06-10 11:55:22.917754] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:44:39.122 [2024-06-10 11:55:22.999979] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:39.122 [2024-06-10 11:55:23.000043] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:44:39.122 [2024-06-10 11:55:23.000056] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:44:39.122 [2024-06-10 11:55:23.000065] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:44:39.381 00:44:39.381 real 0m0.286s 00:44:39.381 user 0m0.171s 00:44:39.381 sys 0m0.114s 00:44:39.381 11:55:23 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:39.381 11:55:23 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:44:39.381 ************************************ 00:44:39.381 END TEST bdev_json_nonarray 00:44:39.381 ************************************ 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:44:39.381 11:55:23 
blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:44:39.381 11:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:44:39.381 00:44:39.381 real 1m8.334s 00:44:39.381 user 2m34.621s 00:44:39.381 sys 0m7.607s 00:44:39.381 11:55:23 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:39.381 11:55:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:44:39.381 ************************************ 00:44:39.381 END TEST blockdev_crypto_aesni 00:44:39.381 ************************************ 00:44:39.381 11:55:23 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:44:39.381 11:55:23 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:44:39.381 11:55:23 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:39.381 11:55:23 -- common/autotest_common.sh@10 -- # set +x 00:44:39.381 ************************************ 00:44:39.381 START TEST blockdev_crypto_sw 00:44:39.381 ************************************ 00:44:39.381 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:44:39.381 * Looking for test storage... 
00:44:39.381 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:44:39.381 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:44:39.382 
11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=285760 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 285760 00:44:39.382 11:55:23 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:44:39.382 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@830 -- # '[' -z 285760 ']' 00:44:39.382 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:44:39.382 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local max_retries=100 00:44:39.382 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:44:39.382 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:44:39.382 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@839 -- # xtrace_disable 00:44:39.382 11:55:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:39.641 [2024-06-10 11:55:23.381848] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:44:39.641 [2024-06-10 11:55:23.381904] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid285760 ] 00:44:39.641 [2024-06-10 11:55:23.467256] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:44:39.641 [2024-06-10 11:55:23.552055] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@863 -- # return 0 00:44:40.575 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:44:40.575 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:44:40.575 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:40.575 Malloc0 00:44:40.575 Malloc1 00:44:40.575 true 00:44:40.575 true 00:44:40.575 true 00:44:40.575 [2024-06-10 11:55:24.428593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:44:40.575 crypto_ram 00:44:40.575 [2024-06-10 11:55:24.436622] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:44:40.575 crypto_ram2 00:44:40.575 [2024-06-10 11:55:24.444641] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:44:40.575 crypto_ram3 00:44:40.575 [ 00:44:40.575 { 00:44:40.575 "name": "Malloc1", 00:44:40.575 "aliases": [ 00:44:40.575 "73dfeb7c-5a98-4ea8-a184-a3bf0fbdf3f7" 00:44:40.575 ], 00:44:40.575 "product_name": "Malloc disk", 00:44:40.575 "block_size": 4096, 00:44:40.575 "num_blocks": 4096, 00:44:40.575 "uuid": "73dfeb7c-5a98-4ea8-a184-a3bf0fbdf3f7", 00:44:40.575 
"assigned_rate_limits": { 00:44:40.575 "rw_ios_per_sec": 0, 00:44:40.575 "rw_mbytes_per_sec": 0, 00:44:40.575 "r_mbytes_per_sec": 0, 00:44:40.575 "w_mbytes_per_sec": 0 00:44:40.575 }, 00:44:40.575 "claimed": true, 00:44:40.575 "claim_type": "exclusive_write", 00:44:40.575 "zoned": false, 00:44:40.575 "supported_io_types": { 00:44:40.575 "read": true, 00:44:40.575 "write": true, 00:44:40.575 "unmap": true, 00:44:40.575 "write_zeroes": true, 00:44:40.575 "flush": true, 00:44:40.575 "reset": true, 00:44:40.575 "compare": false, 00:44:40.575 "compare_and_write": false, 00:44:40.575 "abort": true, 00:44:40.575 "nvme_admin": false, 00:44:40.575 "nvme_io": false 00:44:40.575 }, 00:44:40.575 "memory_domains": [ 00:44:40.575 { 00:44:40.575 "dma_device_id": "system", 00:44:40.575 "dma_device_type": 1 00:44:40.575 }, 00:44:40.575 { 00:44:40.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:44:40.575 "dma_device_type": 2 00:44:40.575 } 00:44:40.575 ], 00:44:40.575 "driver_specific": {} 00:44:40.575 } 00:44:40.575 ] 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:44:40.575 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:40.575 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:44:40.575 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:44:40.576 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:44:40.576 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:44:40.576 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:40.576 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:44:40.576 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 
-- # rpc_cmd save_subsystem_config -n bdev 00:44:40.576 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:44:40.576 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4c649918-9dcb-559a-9b1f-97f3ba2e964c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4c649918-9dcb-559a-9b1f-97f3ba2e964c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "0355b735-25dc-57bf-824c-9f4378e2de95"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "0355b735-25dc-57bf-824c-9f4378e2de95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM 
EXIT 00:44:40.835 11:55:24 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 285760 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@949 -- # '[' -z 285760 ']' 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # kill -0 285760 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # uname 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 285760 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # echo 'killing process with pid 285760' 00:44:40.835 killing process with pid 285760 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # kill 285760 00:44:40.835 11:55:24 blockdev_crypto_sw -- common/autotest_common.sh@973 -- # wait 285760 00:44:41.403 11:55:25 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:44:41.403 11:55:25 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:44:41.403 11:55:25 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:44:41.403 11:55:25 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:41.403 11:55:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:41.403 ************************************ 00:44:41.403 START TEST bdev_hello_world 00:44:41.403 ************************************ 00:44:41.403 11:55:25 blockdev_crypto_sw.bdev_hello_world -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:44:41.403 [2024-06-10 11:55:25.160112] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:44:41.403 [2024-06-10 11:55:25.160158] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid286087 ] 00:44:41.403 [2024-06-10 11:55:25.247114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:44:41.403 [2024-06-10 11:55:25.325289] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:41.663 [2024-06-10 11:55:25.479099] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:44:41.663 [2024-06-10 11:55:25.479154] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:44:41.663 [2024-06-10 11:55:25.479164] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:41.663 [2024-06-10 11:55:25.487117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:44:41.663 [2024-06-10 11:55:25.487129] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:44:41.663 [2024-06-10 11:55:25.487136] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:41.663 [2024-06-10 11:55:25.495137] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:44:41.663 [2024-06-10 11:55:25.495148] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:44:41.663 [2024-06-10 11:55:25.495155] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:44:41.663 [2024-06-10 11:55:25.533489] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:44:41.663 [2024-06-10 11:55:25.533520] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:44:41.663 [2024-06-10 11:55:25.533534] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:44:41.663 [2024-06-10 11:55:25.534984] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:44:41.663 [2024-06-10 11:55:25.535049] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:44:41.663 [2024-06-10 11:55:25.535061] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:44:41.663 [2024-06-10 11:55:25.535087] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:44:41.663 00:44:41.663 [2024-06-10 11:55:25.535100] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:44:41.922 00:44:41.922 real 0m0.607s 00:44:41.922 user 0m0.415s 00:44:41.922 sys 0m0.182s 00:44:41.922 11:55:25 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:44:41.923 ************************************ 00:44:41.923 END TEST bdev_hello_world 00:44:41.923 ************************************ 00:44:41.923 11:55:25 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:44:41.923 11:55:25 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:44:41.923 11:55:25 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:41.923 11:55:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:41.923 ************************************ 00:44:41.923 START TEST bdev_bounds 00:44:41.923 ************************************ 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:44:41.923 11:55:25 
blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=286144 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 286144' 00:44:41.923 Process bdevio pid: 286144 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 286144 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 286144 ']' 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:44:41.923 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:44:41.923 11:55:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:44:41.923 [2024-06-10 11:55:25.824259] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:44:41.923 [2024-06-10 11:55:25.824305] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid286144 ] 00:44:42.182 [2024-06-10 11:55:25.912366] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:44:42.182 [2024-06-10 11:55:26.001044] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:44:42.182 [2024-06-10 11:55:26.001128] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:44:42.182 [2024-06-10 11:55:26.001131] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:42.441 [2024-06-10 11:55:26.165395] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:44:42.441 [2024-06-10 11:55:26.165449] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:44:42.441 [2024-06-10 11:55:26.165460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:42.441 [2024-06-10 11:55:26.173416] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:44:42.441 [2024-06-10 11:55:26.173429] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:44:42.441 [2024-06-10 11:55:26.173437] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:42.441 [2024-06-10 11:55:26.181438] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:44:42.441 [2024-06-10 11:55:26.181452] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:44:42.441 [2024-06-10 11:55:26.181460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:43.009 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 
00:44:43.009 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:44:43.009 11:55:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:44:43.009 I/O targets: 00:44:43.009 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:44:43.009 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:44:43.009 00:44:43.009 00:44:43.009 CUnit - A unit testing framework for C - Version 2.1-3 00:44:43.009 http://cunit.sourceforge.net/ 00:44:43.009 00:44:43.009 00:44:43.009 Suite: bdevio tests on: crypto_ram3 00:44:43.009 Test: blockdev write read block ...passed 00:44:43.009 Test: blockdev write zeroes read block ...passed 00:44:43.009 Test: blockdev write zeroes read no split ...passed 00:44:43.009 Test: blockdev write zeroes read split ...passed 00:44:43.009 Test: blockdev write zeroes read split partial ...passed 00:44:43.009 Test: blockdev reset ...passed 00:44:43.009 Test: blockdev write read 8 blocks ...passed 00:44:43.009 Test: blockdev write read size > 128k ...passed 00:44:43.009 Test: blockdev write read invalid size ...passed 00:44:43.009 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:44:43.009 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:44:43.009 Test: blockdev write read max offset ...passed 00:44:43.009 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:44:43.009 Test: blockdev writev readv 8 blocks ...passed 00:44:43.009 Test: blockdev writev readv 30 x 1block ...passed 00:44:43.009 Test: blockdev writev readv block ...passed 00:44:43.009 Test: blockdev writev readv size > 128k ...passed 00:44:43.010 Test: blockdev writev readv size > 128k in two iovs ...passed 00:44:43.010 Test: blockdev comparev and writev ...passed 00:44:43.010 Test: blockdev nvme passthru rw ...passed 00:44:43.010 Test: blockdev nvme passthru vendor specific ...passed 00:44:43.010 
Test: blockdev nvme admin passthru ...passed 00:44:43.010 Test: blockdev copy ...passed 00:44:43.010 Suite: bdevio tests on: crypto_ram 00:44:43.010 Test: blockdev write read block ...passed 00:44:43.010 Test: blockdev write zeroes read block ...passed 00:44:43.010 Test: blockdev write zeroes read no split ...passed 00:44:43.010 Test: blockdev write zeroes read split ...passed 00:44:43.010 Test: blockdev write zeroes read split partial ...passed 00:44:43.010 Test: blockdev reset ...passed 00:44:43.010 Test: blockdev write read 8 blocks ...passed 00:44:43.010 Test: blockdev write read size > 128k ...passed 00:44:43.010 Test: blockdev write read invalid size ...passed 00:44:43.010 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:44:43.010 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:44:43.010 Test: blockdev write read max offset ...passed 00:44:43.010 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:44:43.010 Test: blockdev writev readv 8 blocks ...passed 00:44:43.010 Test: blockdev writev readv 30 x 1block ...passed 00:44:43.010 Test: blockdev writev readv block ...passed 00:44:43.010 Test: blockdev writev readv size > 128k ...passed 00:44:43.010 Test: blockdev writev readv size > 128k in two iovs ...passed 00:44:43.010 Test: blockdev comparev and writev ...passed 00:44:43.010 Test: blockdev nvme passthru rw ...passed 00:44:43.010 Test: blockdev nvme passthru vendor specific ...passed 00:44:43.010 Test: blockdev nvme admin passthru ...passed 00:44:43.010 Test: blockdev copy ...passed 00:44:43.010 00:44:43.010 Run Summary: Type Total Ran Passed Failed Inactive 00:44:43.010 suites 2 2 n/a 0 0 00:44:43.010 tests 46 46 46 0 0 00:44:43.010 asserts 260 260 260 0 n/a 00:44:43.010 00:44:43.010 Elapsed time = 0.078 seconds 00:44:43.010 0 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 286144 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@949 -- # '[' -z 286144 ']' 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 286144 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 286144 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 286144' 00:44:43.010 killing process with pid 286144 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # kill 286144 00:44:43.010 11:55:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@973 -- # wait 286144 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:44:43.270 00:44:43.270 real 0m1.252s 00:44:43.270 user 0m3.277s 00:44:43.270 sys 0m0.325s 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:44:43.270 ************************************ 00:44:43.270 END TEST bdev_bounds 00:44:43.270 ************************************ 00:44:43.270 11:55:27 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:44:43.270 11:55:27 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:44:43.270 11:55:27 blockdev_crypto_sw -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:44:43.270 11:55:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:43.270 ************************************ 00:44:43.270 START TEST bdev_nbd 00:44:43.270 ************************************ 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=286356 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 286356 /var/tmp/spdk-nbd.sock 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 286356 ']' 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:44:43.270 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:44:43.270 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:44:43.270 [2024-06-10 11:55:27.168325] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:44:43.270 [2024-06-10 11:55:27.168372] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:44:43.529 [2024-06-10 11:55:27.255343] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:44:43.529 [2024-06-10 11:55:27.339704] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:44:43.788 [2024-06-10 11:55:27.492423] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:44:43.788 [2024-06-10 11:55:27.492474] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:44:43.788 [2024-06-10 11:55:27.492483] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:43.788 [2024-06-10 11:55:27.500441] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:44:43.789 [2024-06-10 11:55:27.500453] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:44:43.789 [2024-06-10 11:55:27.500461] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:43.789 [2024-06-10 11:55:27.508462] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:44:43.789 [2024-06-10 11:55:27.508472] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:44:43.789 [2024-06-10 11:55:27.508479] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram3' 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:44:44.048 11:55:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # 
(( i = 1 )) 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:44:44.307 1+0 records in 00:44:44.307 1+0 records out 00:44:44.307 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266632 s, 15.4 MB/s 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:44:44.307 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:44:44.566 11:55:28 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:44:44.566 1+0 records in 00:44:44.566 1+0 records out 00:44:44.566 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305893 s, 13.4 MB/s 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:44:44.566 11:55:28 
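The `waitfornbd` helper exercised twice above (for `nbd0` and `nbd1`) polls `/proc/partitions` until the kernel exposes the device, then reads one 4 KiB block with `iflag=direct` to confirm the device can actually service I/O. A minimal standalone sketch of that pattern, assuming Linux and the same 20-attempt budget seen in the trace (the function body and temp-file path are illustrative, not SPDK's exact source):

```shell
# Illustrative reimplementation of the waitfornbd pattern from the trace:
# 1) poll /proc/partitions for the nbd name, 2) read one direct-I/O block
# to prove the device is serviceable, 3) check a non-zero byte count.
waitfornbd() {
    local nbd_name=$1 tmp i size
    tmp=$(mktemp)
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" /proc/partitions; then break; fi
        sleep 0.1
    done
    if ((i > 20)); then
        rm -f "$tmp"
        return 1                       # device never appeared
    fi
    for ((i = 1; i <= 20; i++)); do
        if dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 \
              iflag=direct 2>/dev/null; then break; fi
        sleep 0.1
    done
    size=$(stat -c %s "$tmp" 2>/dev/null || echo 0)
    rm -f "$tmp"
    [ "$size" -gt 0 ]                  # mirrors the trace's '[ 4096 != 0 ]'
}
```

In the trace this is invoked as `waitfornbd nbd0` right after `rpc.py ... nbd_start_disk` returns the device path.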
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:44:44.566 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:44:44.826 { 00:44:44.826 "nbd_device": "/dev/nbd0", 00:44:44.826 "bdev_name": "crypto_ram" 00:44:44.826 }, 00:44:44.826 { 00:44:44.826 "nbd_device": "/dev/nbd1", 00:44:44.826 "bdev_name": "crypto_ram3" 00:44:44.826 } 00:44:44.826 ]' 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:44:44.826 { 00:44:44.826 "nbd_device": "/dev/nbd0", 00:44:44.826 "bdev_name": "crypto_ram" 00:44:44.826 }, 00:44:44.826 { 00:44:44.826 "nbd_device": "/dev/nbd1", 00:44:44.826 "bdev_name": "crypto_ram3" 00:44:44.826 } 00:44:44.826 ]' 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:44:44.826 11:55:28 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:44:45.084 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:44:45.085 
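The inverse helper, `waitfornbd_exit`, runs after each `nbd_stop_disk` RPC above: it polls until the device name disappears from `/proc/partitions`. A hedged sketch of that loop (the retry count matches the trace; the sleep interval is an assumption):

```shell
# Poll until the named nbd device is gone from /proc/partitions, giving up
# after 20 attempts. Mirrors the waitfornbd_exit loop visible in the trace.
waitfornbd_exit() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do
        if ! grep -q -w "$nbd_name" /proc/partitions; then break; fi
        sleep 0.1
    done
    ((i <= 20))                        # nonzero exit: device never detached
}
```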
11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:45.085 11:55:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- 
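`nbd_get_count`, seen above returning `count=0` after both disks are stopped, turns the JSON from the `nbd_get_disks` RPC into a device count: it extracts each `.nbd_device` with `jq -r` and then counts matches with `grep -c /dev/nbd`. The sketch below substitutes canned JSON for the live RPC output and a plain `grep` for the `jq` step so it runs standalone; both substitutions are assumptions:

```shell
# Count attached nbd devices from nbd_get_disks-style JSON. The real test
# pipes rpc.py output through jq -r '.[] | .nbd_device' first; a direct
# grep for the device path stands in here so no jq is required.
nbd_disks_json='[
  { "nbd_device": "/dev/nbd0", "bdev_name": "crypto_ram" },
  { "nbd_device": "/dev/nbd1", "bdev_name": "crypto_ram3" }
]'
count=$(grep -c '/dev/nbd' <<<"$nbd_disks_json")
echo "$count"    # 2 while both disks are attached; 0 after nbd_stop_disk
```

Note that on an empty list (`[]`) `grep -c` prints `0` but exits nonzero, which is why the trace shows a `true` fallback around this step.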
bdev/nbd_common.sh@91 -- # local bdev_list 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:44:45.343 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:44:45.602 /dev/nbd0 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:44:45.602 11:55:29 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:44:45.602 1+0 records in 00:44:45.602 1+0 records out 00:44:45.602 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268054 s, 15.3 MB/s 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:44:45.602 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:44:45.860 /dev/nbd1 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:44:45.860 11:55:29 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:44:45.860 1+0 records in 00:44:45.860 1+0 records out 00:44:45.860 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268828 s, 15.2 MB/s 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:45.860 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:44:46.119 { 00:44:46.119 "nbd_device": "/dev/nbd0", 00:44:46.119 "bdev_name": "crypto_ram" 00:44:46.119 }, 00:44:46.119 { 00:44:46.119 "nbd_device": "/dev/nbd1", 00:44:46.119 "bdev_name": "crypto_ram3" 00:44:46.119 } 00:44:46.119 ]' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:44:46.119 { 00:44:46.119 "nbd_device": "/dev/nbd0", 00:44:46.119 "bdev_name": "crypto_ram" 00:44:46.119 }, 00:44:46.119 { 00:44:46.119 "nbd_device": "/dev/nbd1", 00:44:46.119 "bdev_name": "crypto_ram3" 00:44:46.119 } 00:44:46.119 ]' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:44:46.119 /dev/nbd1' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:44:46.119 /dev/nbd1' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:44:46.119 256+0 records in 00:44:46.119 256+0 records out 00:44:46.119 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110327 s, 95.0 MB/s 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:44:46.119 256+0 records in 00:44:46.119 256+0 records out 00:44:46.119 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206849 s, 50.7 MB/s 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:44:46.119 256+0 records in 00:44:46.119 256+0 records out 00:44:46.119 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0311964 s, 33.6 MB/s 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 
00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:44:46.119 11:55:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:44:46.119 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:44:46.119 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:44:46.119 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:44:46.119 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:44:46.120 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:46.120 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:46.120 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:44:46.120 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:44:46.120 11:55:30 
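The `nbd_dd_data_verify` pass above fills `nbdrandtest` with 1 MiB of random data, `dd`s it onto each nbd device with `oflag=direct`, then `cmp`s the first 1 MiB back from each device. A self-contained sketch of that write/verify flow, with plain temp files standing in for `/dev/nbd0` and `/dev/nbd1` (an assumption so it runs without real nbd devices):

```shell
# Write/verify pattern from the trace: random source file -> dd to each
# device -> cmp the first 1M back. Real block devices would also pass
# oflag=direct to dd, as the trace does.
tmp_file=$(mktemp)
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 2>/dev/null

nbd_list=("$(mktemp)" "$(mktemp)")     # stand-ins for /dev/nbd0 /dev/nbd1
verify_status=0
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 2>/dev/null
done
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev" || verify_status=1   # nonzero = mismatch
done
rm -f "${nbd_list[@]}"
```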
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:44:46.120 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:44:46.378 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@41 -- # break 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:46.636 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:44:46.894 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:44:46.894 malloc_lvol_verify 00:44:47.153 11:55:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:44:47.153 1c5306ee-e7cb-427a-8916-4342d244cef6 00:44:47.153 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:44:47.411 9eb0189e-3db5-455e-93e0-9ae53570b796 00:44:47.411 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:44:47.669 /dev/nbd0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:44:47.669 mke2fs 1.46.5 (30-Dec-2021) 00:44:47.669 Discarding device blocks: 0/4096 done 00:44:47.669 Creating filesystem with 4096 1k blocks and 1024 inodes 00:44:47.669 00:44:47.669 Allocating group tables: 0/1 done 00:44:47.669 Writing inode tables: 0/1 done 00:44:47.669 Creating journal (1024 blocks): done 00:44:47.669 Writing superblocks and filesystem accounting information: 0/1 done 00:44:47.669 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:44:47.669 
11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 286356 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 286356 ']' 00:44:47.669 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 286356 00:44:47.669 11:55:31 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 286356 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 286356' 00:44:47.928 killing process with pid 286356 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # kill 286356 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@973 -- # wait 286356 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:44:47.928 00:44:47.928 real 0m4.743s 00:44:47.928 user 0m6.565s 00:44:47.928 sys 0m1.937s 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:44:47.928 11:55:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:44:47.928 ************************************ 00:44:47.928 END TEST bdev_nbd 00:44:47.928 ************************************ 00:44:48.188 11:55:31 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:44:48.188 11:55:31 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:44:48.188 11:55:31 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:44:48.188 11:55:31 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:44:48.188 11:55:31 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:44:48.188 11:55:31 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 
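The teardown above runs `killprocess 286356` followed by `wait 286356` to stop the `bdev_svc` daemon. A loose sketch of that kill-then-reap flow; the real helper also inspects `uname` and the process name via `ps` (and refuses to kill `sudo`), details omitted here as assumptions:

```shell
# Terminate a test daemon by pid: confirm it is alive, signal it, then
# reap it. Sketch of the killprocess/wait pair seen in the trace; 'wait'
# works here because the pid is a child of this shell.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    kill -0 "$pid" 2>/dev/null || return 0   # already exited
    echo "killing process with pid $pid"
    kill "$pid" 2>/dev/null
    wait "$pid" 2>/dev/null || true          # reap; ignore signal exit code
}
```

Typical use: start the daemon in the background, record `$!`, and call `killprocess "$pid"` from a `trap ... SIGINT SIGTERM EXIT` handler, as the trace's `trap 'cleanup; killprocess $nbd_pid' ...` line does.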
00:44:48.188 11:55:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:44:48.188 ************************************ 00:44:48.188 START TEST bdev_fio 00:44:48.188 ************************************ 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:44:48.188 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:44:48.188 11:55:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:44:48.188 ************************************ 00:44:48.188 START TEST bdev_fio_rw_verify 00:44:48.188 ************************************ 00:44:48.188 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:48.189 11:55:32 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:44:48.189 11:55:32 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:44:48.189 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:44:48.446 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:44:48.446 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:44:48.446 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:44:48.446 11:55:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:44:48.704 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:44:48.704 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:44:48.704 fio-3.35 00:44:48.704 Starting 2 threads 00:45:01.003 00:45:01.003 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=287289: Mon Jun 10 11:55:42 2024 00:45:01.003 read: IOPS=29.6k, BW=116MiB/s (121MB/s)(1158MiB/10001msec) 00:45:01.003 slat (usec): min=8, max=1665, avg=15.37, stdev= 6.78 00:45:01.003 clat (usec): min=5, max=1828, avg=109.07, stdev=53.04 00:45:01.003 lat (usec): min=17, max=1846, avg=124.44, stdev=56.21 00:45:01.003 clat percentiles (usec): 00:45:01.003 | 50.000th=[ 103], 99.000th=[ 306], 99.900th=[ 379], 99.990th=[ 429], 00:45:01.003 | 99.999th=[ 1795] 00:45:01.003 write: IOPS=35.6k, BW=139MiB/s 
(146MB/s)(1319MiB/9478msec); 0 zone resets 00:45:01.003 slat (usec): min=9, max=476, avg=24.93, stdev= 7.30 00:45:01.003 clat (usec): min=17, max=862, avg=145.44, stdev=76.66 00:45:01.003 lat (usec): min=35, max=947, avg=170.37, stdev=80.56 00:45:01.003 clat percentiles (usec): 00:45:01.003 | 50.000th=[ 139], 99.000th=[ 416], 99.900th=[ 537], 99.990th=[ 660], 00:45:01.003 | 99.999th=[ 824] 00:45:01.003 bw ( KiB/s): min=96944, max=150192, per=95.72%, avg=136446.74, stdev=8484.38, samples=38 00:45:01.003 iops : min=24236, max=37548, avg=34111.68, stdev=2121.09, samples=38 00:45:01.003 lat (usec) : 10=0.01%, 20=0.01%, 50=8.75%, 100=30.14%, 250=56.09% 00:45:01.003 lat (usec) : 500=4.86%, 750=0.14%, 1000=0.01% 00:45:01.003 lat (msec) : 2=0.01% 00:45:01.003 cpu : usr=99.66%, sys=0.01%, ctx=33, majf=0, minf=493 00:45:01.003 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:45:01.003 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:45:01.003 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:45:01.003 issued rwts: total=296482,337778,0,0 short=0,0,0,0 dropped=0,0,0,0 00:45:01.003 latency : target=0, window=0, percentile=100.00%, depth=8 00:45:01.003 00:45:01.003 Run status group 0 (all jobs): 00:45:01.003 READ: bw=116MiB/s (121MB/s), 116MiB/s-116MiB/s (121MB/s-121MB/s), io=1158MiB (1214MB), run=10001-10001msec 00:45:01.003 WRITE: bw=139MiB/s (146MB/s), 139MiB/s-139MiB/s (146MB/s-146MB/s), io=1319MiB (1384MB), run=9478-9478msec 00:45:01.003 00:45:01.003 real 0m11.091s 00:45:01.003 user 0m23.302s 00:45:01.003 sys 0m0.315s 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:45:01.003 ************************************ 00:45:01.003 END TEST bdev_fio_rw_verify 00:45:01.003 ************************************ 
00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:45:01.003 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:45:01.003 
11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4c649918-9dcb-559a-9b1f-97f3ba2e964c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4c649918-9dcb-559a-9b1f-97f3ba2e964c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "0355b735-25dc-57bf-824c-9f4378e2de95"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "0355b735-25dc-57bf-824c-9f4378e2de95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:45:01.004 crypto_ram3 ]] 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4c649918-9dcb-559a-9b1f-97f3ba2e964c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4c649918-9dcb-559a-9b1f-97f3ba2e964c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "0355b735-25dc-57bf-824c-9f4378e2de95"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' 
"uuid": "0355b735-25dc-57bf-824c-9f4378e2de95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:45:01.004 ************************************ 00:45:01.004 START TEST bdev_fio_trim 00:45:01.004 ************************************ 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:45:01.004 11:55:43 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:45:01.004 11:55:43 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:45:01.004 11:55:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:01.004 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:45:01.004 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:45:01.004 fio-3.35 00:45:01.004 Starting 2 threads 00:45:10.974 00:45:10.974 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=288811: Mon Jun 10 11:55:54 2024 00:45:10.974 write: IOPS=36.0k, BW=141MiB/s (148MB/s)(1407MiB/10001msec); 0 zone resets 00:45:10.974 slat (usec): min=8, max=1804, avg=24.30, stdev=11.43 00:45:10.974 clat (usec): min=23, max=2238, avg=179.93, stdev=129.55 00:45:10.974 lat (usec): min=32, max=2274, avg=204.24, stdev=137.71 00:45:10.974 clat percentiles (usec): 00:45:10.974 | 50.000th=[ 137], 99.000th=[ 537], 99.900th=[ 594], 99.990th=[ 635], 00:45:10.974 | 99.999th=[ 988] 00:45:10.974 bw ( KiB/s): min=96392, max=221904, per=100.00%, avg=145518.32, stdev=20425.67, samples=38 00:45:10.974 iops : min=24098, max=55476, avg=36379.58, stdev=5106.42, samples=38 00:45:10.974 trim: IOPS=36.0k, BW=141MiB/s (148MB/s)(1407MiB/10001msec); 0 zone resets 00:45:10.974 slat (usec): min=3, max=413, avg=12.33, stdev= 6.29 00:45:10.974 clat (usec): min=28, max=998, avg=118.72, stdev=54.79 00:45:10.974 lat (usec): min=33, max=1005, avg=131.05, stdev=58.07 00:45:10.974 clat percentiles (usec): 00:45:10.974 
| 50.000th=[ 108], 99.000th=[ 273], 99.900th=[ 310], 99.990th=[ 474], 00:45:10.974 | 99.999th=[ 611] 00:45:10.975 bw ( KiB/s): min=96392, max=221912, per=100.00%, avg=145519.58, stdev=20427.13, samples=38 00:45:10.975 iops : min=24098, max=55478, avg=36379.89, stdev=5106.78, samples=38 00:45:10.975 lat (usec) : 50=7.18%, 100=33.02%, 250=46.09%, 500=12.51%, 750=1.20% 00:45:10.975 lat (usec) : 1000=0.01% 00:45:10.975 lat (msec) : 4=0.01% 00:45:10.975 cpu : usr=99.67%, sys=0.00%, ctx=72, majf=0, minf=343 00:45:10.975 IO depths : 1=7.7%, 2=17.8%, 4=59.6%, 8=14.9%, 16=0.0%, 32=0.0%, >=64=0.0% 00:45:10.975 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:45:10.975 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:45:10.975 issued rwts: total=0,360299,360300,0 short=0,0,0,0 dropped=0,0,0,0 00:45:10.975 latency : target=0, window=0, percentile=100.00%, depth=8 00:45:10.975 00:45:10.975 Run status group 0 (all jobs): 00:45:10.975 WRITE: bw=141MiB/s (148MB/s), 141MiB/s-141MiB/s (148MB/s-148MB/s), io=1407MiB (1476MB), run=10001-10001msec 00:45:10.975 TRIM: bw=141MiB/s (148MB/s), 141MiB/s-141MiB/s (148MB/s-148MB/s), io=1407MiB (1476MB), run=10001-10001msec 00:45:10.975 00:45:10.975 real 0m11.084s 00:45:10.975 user 0m23.379s 00:45:10.975 sys 0m0.302s 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:45:10.975 ************************************ 00:45:10.975 END TEST bdev_fio_trim 00:45:10.975 ************************************ 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 
00:45:10.975 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:45:10.975 00:45:10.975 real 0m22.530s 00:45:10.975 user 0m46.852s 00:45:10.975 sys 0m0.820s 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:45:10.975 ************************************ 00:45:10.975 END TEST bdev_fio 00:45:10.975 ************************************ 00:45:10.975 11:55:54 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:45:10.975 11:55:54 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:45:10.975 11:55:54 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:45:10.975 11:55:54 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:10.975 11:55:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:10.975 ************************************ 00:45:10.975 START TEST bdev_verify 00:45:10.975 ************************************ 00:45:10.975 11:55:54 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:45:10.975 [2024-06-10 11:55:54.617987] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:45:10.975 [2024-06-10 11:55:54.618031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid290229 ] 00:45:10.975 [2024-06-10 11:55:54.702213] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:45:10.975 [2024-06-10 11:55:54.783818] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:45:10.975 [2024-06-10 11:55:54.783819] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:11.233 [2024-06-10 11:55:54.951040] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:45:11.233 [2024-06-10 11:55:54.951094] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:45:11.233 [2024-06-10 11:55:54.951104] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:11.233 [2024-06-10 11:55:54.959059] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:45:11.234 [2024-06-10 11:55:54.959072] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:45:11.234 [2024-06-10 11:55:54.959079] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:11.234 [2024-06-10 11:55:54.967083] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:45:11.234 [2024-06-10 11:55:54.967094] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:45:11.234 [2024-06-10 11:55:54.967102] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:11.234 Running I/O for 5 seconds... 
00:45:16.503 00:45:16.504 Latency(us) 00:45:16.504 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:45:16.504 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:45:16.504 Verification LBA range: start 0x0 length 0x800 00:45:16.504 crypto_ram : 5.01 8426.81 32.92 0.00 0.00 15134.82 1118.39 17780.20 00:45:16.504 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:45:16.504 Verification LBA range: start 0x800 length 0x800 00:45:16.504 crypto_ram : 5.01 8429.25 32.93 0.00 0.00 15132.44 1367.71 17780.20 00:45:16.504 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:45:16.504 Verification LBA range: start 0x0 length 0x800 00:45:16.504 crypto_ram3 : 5.01 4212.01 16.45 0.00 0.00 30260.76 5328.36 20515.62 00:45:16.504 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:45:16.504 Verification LBA range: start 0x800 length 0x800 00:45:16.504 crypto_ram3 : 5.02 4232.23 16.53 0.00 0.00 30114.29 1296.47 21085.50 00:45:16.504 =================================================================================================================== 00:45:16.504 Total : 25300.31 98.83 0.00 0.00 20161.65 1118.39 21085.50 00:45:16.504 00:45:16.504 real 0m5.679s 00:45:16.504 user 0m10.785s 00:45:16.504 sys 0m0.209s 00:45:16.504 11:56:00 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:16.504 11:56:00 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:45:16.504 ************************************ 00:45:16.504 END TEST bdev_verify 00:45:16.504 ************************************ 00:45:16.504 11:56:00 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:45:16.504 11:56:00 
blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:45:16.504 11:56:00 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:16.504 11:56:00 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:16.504 ************************************ 00:45:16.504 START TEST bdev_verify_big_io 00:45:16.504 ************************************ 00:45:16.504 11:56:00 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:45:16.504 [2024-06-10 11:56:00.371388] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:45:16.504 [2024-06-10 11:56:00.371433] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid290944 ] 00:45:16.762 [2024-06-10 11:56:00.455243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:45:16.762 [2024-06-10 11:56:00.537168] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:45:16.762 [2024-06-10 11:56:00.537170] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:17.021 [2024-06-10 11:56:00.715221] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:45:17.021 [2024-06-10 11:56:00.715269] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:45:17.021 [2024-06-10 11:56:00.715280] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:17.021 [2024-06-10 11:56:00.723241] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:45:17.021 [2024-06-10 11:56:00.723256] bdev.c:8114:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:45:17.021 [2024-06-10 11:56:00.723264] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:17.021 [2024-06-10 11:56:00.731263] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:45:17.021 [2024-06-10 11:56:00.731276] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:45:17.021 [2024-06-10 11:56:00.731284] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:17.021 Running I/O for 5 seconds... 00:45:22.287 00:45:22.287 Latency(us) 00:45:22.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:45:22.287 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:45:22.287 Verification LBA range: start 0x0 length 0x80 00:45:22.287 crypto_ram : 5.23 660.23 41.26 0.00 0.00 190445.11 5157.40 246187.41 00:45:22.287 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:45:22.287 Verification LBA range: start 0x80 length 0x80 00:45:22.287 crypto_ram : 5.24 659.74 41.23 0.00 0.00 190550.81 4786.98 244363.80 00:45:22.287 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:45:22.287 Verification LBA range: start 0x0 length 0x80 00:45:22.287 crypto_ram3 : 5.25 341.65 21.35 0.00 0.00 358539.71 4587.52 258952.68 00:45:22.287 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:45:22.287 Verification LBA range: start 0x80 length 0x80 00:45:22.287 crypto_ram3 : 5.25 341.42 21.34 0.00 0.00 358834.96 4103.12 258952.68 00:45:22.287 =================================================================================================================== 00:45:22.287 Total : 2003.04 125.19 0.00 0.00 247928.48 4103.12 258952.68 00:45:22.546 00:45:22.546 real 0m5.937s 00:45:22.546 user 0m11.288s 00:45:22.546 sys 0m0.211s 00:45:22.546 
11:56:06 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:22.546 11:56:06 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:45:22.546 ************************************ 00:45:22.546 END TEST bdev_verify_big_io 00:45:22.546 ************************************ 00:45:22.546 11:56:06 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:45:22.546 11:56:06 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:45:22.546 11:56:06 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:22.546 11:56:06 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:22.546 ************************************ 00:45:22.546 START TEST bdev_write_zeroes 00:45:22.546 ************************************ 00:45:22.546 11:56:06 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:45:22.546 [2024-06-10 11:56:06.376328] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:45:22.546 [2024-06-10 11:56:06.376375] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid291751 ] 00:45:22.546 [2024-06-10 11:56:06.461436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:22.805 [2024-06-10 11:56:06.545006] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:22.805 [2024-06-10 11:56:06.708398] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:45:22.805 [2024-06-10 11:56:06.708455] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:45:22.805 [2024-06-10 11:56:06.708464] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:22.805 [2024-06-10 11:56:06.716417] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:45:22.805 [2024-06-10 11:56:06.716430] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:45:22.805 [2024-06-10 11:56:06.716439] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:22.805 [2024-06-10 11:56:06.724437] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:45:22.805 [2024-06-10 11:56:06.724449] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:45:22.805 [2024-06-10 11:56:06.724456] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:23.063 Running I/O for 1 seconds... 
00:45:23.998 00:45:23.999 Latency(us) 00:45:23.999 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:45:23.999 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:45:23.999 crypto_ram : 1.00 42158.60 164.68 0.00 0.00 3031.15 1310.72 4359.57 00:45:23.999 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:45:23.999 crypto_ram3 : 1.01 21107.59 82.45 0.00 0.00 6035.14 2194.03 6553.60 00:45:23.999 =================================================================================================================== 00:45:23.999 Total : 63266.19 247.13 0.00 0.00 4034.50 1310.72 6553.60 00:45:24.258 00:45:24.258 real 0m1.643s 00:45:24.258 user 0m1.420s 00:45:24.258 sys 0m0.206s 00:45:24.258 11:56:07 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:24.258 11:56:07 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:45:24.258 ************************************ 00:45:24.258 END TEST bdev_write_zeroes 00:45:24.258 ************************************ 00:45:24.258 11:56:07 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:45:24.258 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:45:24.258 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:24.258 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:24.258 ************************************ 00:45:24.258 START TEST bdev_json_nonenclosed 00:45:24.258 ************************************ 00:45:24.258 11:56:08 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:45:24.258 [2024-06-10 11:56:08.079901] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:45:24.258 [2024-06-10 11:56:08.079946] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid292023 ] 00:45:24.258 [2024-06-10 11:56:08.159378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:24.517 [2024-06-10 11:56:08.242431] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:24.517 [2024-06-10 11:56:08.242490] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:45:24.517 [2024-06-10 11:56:08.242504] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:45:24.517 [2024-06-10 11:56:08.242512] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:45:24.517 00:45:24.517 real 0m0.286s 00:45:24.517 user 0m0.181s 00:45:24.517 sys 0m0.103s 00:45:24.517 11:56:08 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:24.517 11:56:08 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:45:24.517 ************************************ 00:45:24.517 END TEST bdev_json_nonenclosed 00:45:24.517 ************************************ 00:45:24.517 11:56:08 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:45:24.517 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- 
# '[' 13 -le 1 ']' 00:45:24.517 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:24.517 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:24.517 ************************************ 00:45:24.517 START TEST bdev_json_nonarray 00:45:24.517 ************************************ 00:45:24.517 11:56:08 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:45:24.517 [2024-06-10 11:56:08.455496] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:45:24.517 [2024-06-10 11:56:08.455537] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid292047 ] 00:45:24.776 [2024-06-10 11:56:08.540483] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:24.776 [2024-06-10 11:56:08.621735] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:24.776 [2024-06-10 11:56:08.621799] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:45:24.776 [2024-06-10 11:56:08.621813] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:45:24.776 [2024-06-10 11:56:08.621821] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:45:24.776 00:45:24.776 real 0m0.300s 00:45:24.776 user 0m0.184s 00:45:24.776 sys 0m0.115s 00:45:24.776 11:56:08 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:24.776 11:56:08 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:45:24.776 ************************************ 00:45:24.776 END TEST bdev_json_nonarray 00:45:24.776 ************************************ 00:45:25.035 11:56:08 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:45:25.035 11:56:08 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:45:25.035 11:56:08 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:45:25.035 11:56:08 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:45:25.035 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:45:25.035 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:25.035 11:56:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:25.035 ************************************ 00:45:25.035 START TEST bdev_crypto_enomem 00:45:25.035 ************************************ 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # bdev_crypto_enomem 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:45:25.035 11:56:08 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=292069 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 292069 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@830 -- # '[' -z 292069 ']' 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local max_retries=100 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:45:25.035 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@839 -- # xtrace_disable 00:45:25.035 11:56:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:25.035 [2024-06-10 11:56:08.844754] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:45:25.035 [2024-06-10 11:56:08.844808] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid292069 ] 00:45:25.035 [2024-06-10 11:56:08.933723] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:25.295 [2024-06-10 11:56:09.026066] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@863 -- # return 0 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:25.864 true 00:45:25.864 base0 00:45:25.864 true 00:45:25.864 [2024-06-10 11:56:09.674936] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:45:25.864 crypt0 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_name=crypt0 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # local i 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # 
bdev_timeout=2000 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:25.864 [ 00:45:25.864 { 00:45:25.864 "name": "crypt0", 00:45:25.864 "aliases": [ 00:45:25.864 "51469b4f-c195-58cd-918c-491e99bea3e0" 00:45:25.864 ], 00:45:25.864 "product_name": "crypto", 00:45:25.864 "block_size": 512, 00:45:25.864 "num_blocks": 2097152, 00:45:25.864 "uuid": "51469b4f-c195-58cd-918c-491e99bea3e0", 00:45:25.864 "assigned_rate_limits": { 00:45:25.864 "rw_ios_per_sec": 0, 00:45:25.864 "rw_mbytes_per_sec": 0, 00:45:25.864 "r_mbytes_per_sec": 0, 00:45:25.864 "w_mbytes_per_sec": 0 00:45:25.864 }, 00:45:25.864 "claimed": false, 00:45:25.864 "zoned": false, 00:45:25.864 "supported_io_types": { 00:45:25.864 "read": true, 00:45:25.864 "write": true, 00:45:25.864 "unmap": false, 00:45:25.864 "write_zeroes": true, 00:45:25.864 "flush": false, 00:45:25.864 "reset": true, 00:45:25.864 "compare": false, 00:45:25.864 "compare_and_write": false, 00:45:25.864 "abort": false, 00:45:25.864 "nvme_admin": false, 00:45:25.864 "nvme_io": false 00:45:25.864 }, 00:45:25.864 "memory_domains": [ 00:45:25.864 { 00:45:25.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:45:25.864 "dma_device_type": 2 00:45:25.864 } 00:45:25.864 ], 00:45:25.864 
"driver_specific": { 00:45:25.864 "crypto": { 00:45:25.864 "base_bdev_name": "EE_base0", 00:45:25.864 "name": "crypt0", 00:45:25.864 "key_name": "test_dek_sw" 00:45:25.864 } 00:45:25.864 } 00:45:25.864 } 00:45:25.864 ] 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # return 0 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=292246 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:45:25.864 11:56:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:45:25.864 Running I/O for 5 seconds... 00:45:26.800 11:56:10 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:45:26.800 11:56:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:26.800 11:56:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:26.800 11:56:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:26.800 11:56:10 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 292246 00:45:30.991 00:45:30.991 Latency(us) 00:45:30.991 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:45:30.991 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:45:30.991 crypt0 : 5.00 57276.03 223.73 0.00 0.00 556.14 284.94 797.83 00:45:30.992 =================================================================================================================== 00:45:30.992 Total : 57276.03 223.73 0.00 0.00 556.14 284.94 797.83 00:45:30.992 0 00:45:30.992 11:56:14 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 292069 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@949 -- # '[' -z 292069 ']' 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # kill -0 292069 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # uname 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 292069 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # echo 'killing process with pid 292069' 00:45:30.992 killing process with pid 292069 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # kill 292069 00:45:30.992 Received shutdown signal, test time was about 5.000000 seconds 00:45:30.992 00:45:30.992 Latency(us) 00:45:30.992 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:45:30.992 =================================================================================================================== 00:45:30.992 
Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:45:30.992 11:56:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@973 -- # wait 292069 00:45:31.259 11:56:15 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:45:31.259 00:45:31.259 real 0m6.264s 00:45:31.259 user 0m6.387s 00:45:31.259 sys 0m0.353s 00:45:31.259 11:56:15 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:31.259 11:56:15 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:45:31.259 ************************************ 00:45:31.259 END TEST bdev_crypto_enomem 00:45:31.259 ************************************ 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:45:31.259 11:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:45:31.259 00:45:31.259 real 0m51.889s 00:45:31.259 user 1m29.519s 00:45:31.259 sys 0m5.569s 00:45:31.259 11:56:15 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:31.259 11:56:15 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:45:31.259 ************************************ 00:45:31.259 END TEST blockdev_crypto_sw 00:45:31.259 ************************************ 00:45:31.259 11:56:15 -- 
spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:45:31.259 11:56:15 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:45:31.259 11:56:15 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:31.259 11:56:15 -- common/autotest_common.sh@10 -- # set +x 00:45:31.259 ************************************ 00:45:31.259 START TEST blockdev_crypto_qat 00:45:31.259 ************************************ 00:45:31.259 11:56:15 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:45:31.526 * Looking for test storage... 00:45:31.526 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:45:31.526 11:56:15 blockdev_crypto_qat -- 
bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=293012 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:45:31.526 11:56:15 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 293012 00:45:31.526 11:56:15 blockdev_crypto_qat -- common/autotest_common.sh@830 -- # '[' -z 293012 ']' 00:45:31.526 11:56:15 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:45:31.526 11:56:15 blockdev_crypto_qat 
-- common/autotest_common.sh@835 -- # local max_retries=100 00:45:31.526 11:56:15 blockdev_crypto_qat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:45:31.526 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:45:31.526 11:56:15 blockdev_crypto_qat -- common/autotest_common.sh@839 -- # xtrace_disable 00:45:31.526 11:56:15 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:31.526 [2024-06-10 11:56:15.353450] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:45:31.526 [2024-06-10 11:56:15.353508] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid293012 ] 00:45:31.526 [2024-06-10 11:56:15.440271] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:31.784 [2024-06-10 11:56:15.523532] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:32.351 11:56:16 blockdev_crypto_qat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:45:32.351 11:56:16 blockdev_crypto_qat -- common/autotest_common.sh@863 -- # return 0 00:45:32.351 11:56:16 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:45:32.351 11:56:16 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:45:32.351 11:56:16 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:45:32.351 11:56:16 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:32.351 11:56:16 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:32.351 [2024-06-10 11:56:16.153477] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:45:32.351 [2024-06-10 11:56:16.161511] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be 
assigned to module dpdk_cryptodev 00:45:32.351 [2024-06-10 11:56:16.169523] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:45:32.351 [2024-06-10 11:56:16.238406] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:45:34.883 true 00:45:34.883 true 00:45:34.883 true 00:45:34.883 true 00:45:34.883 Malloc0 00:45:34.883 Malloc1 00:45:34.883 Malloc2 00:45:34.883 Malloc3 00:45:34.883 [2024-06-10 11:56:18.579657] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:45:34.883 crypto_ram 00:45:34.883 [2024-06-10 11:56:18.587675] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:45:34.883 crypto_ram1 00:45:34.883 [2024-06-10 11:56:18.595697] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:45:34.883 crypto_ram2 00:45:34.883 [2024-06-10 11:56:18.603720] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:45:34.883 crypto_ram3 00:45:34.883 [ 00:45:34.883 { 00:45:34.883 "name": "Malloc1", 00:45:34.883 "aliases": [ 00:45:34.883 "170565ae-c315-45c5-a29d-84de581c5402" 00:45:34.883 ], 00:45:34.883 "product_name": "Malloc disk", 00:45:34.883 "block_size": 512, 00:45:34.883 "num_blocks": 65536, 00:45:34.883 "uuid": "170565ae-c315-45c5-a29d-84de581c5402", 00:45:34.883 "assigned_rate_limits": { 00:45:34.883 "rw_ios_per_sec": 0, 00:45:34.883 "rw_mbytes_per_sec": 0, 00:45:34.883 "r_mbytes_per_sec": 0, 00:45:34.883 "w_mbytes_per_sec": 0 00:45:34.883 }, 00:45:34.883 "claimed": true, 00:45:34.883 "claim_type": "exclusive_write", 00:45:34.883 "zoned": false, 00:45:34.883 "supported_io_types": { 00:45:34.883 "read": true, 00:45:34.883 "write": true, 00:45:34.883 "unmap": true, 00:45:34.883 "write_zeroes": true, 00:45:34.883 "flush": true, 00:45:34.883 "reset": true, 00:45:34.883 "compare": false, 00:45:34.883 "compare_and_write": 
false, 00:45:34.883 "abort": true, 00:45:34.883 "nvme_admin": false, 00:45:34.883 "nvme_io": false 00:45:34.883 }, 00:45:34.883 "memory_domains": [ 00:45:34.883 { 00:45:34.883 "dma_device_id": "system", 00:45:34.883 "dma_device_type": 1 00:45:34.883 }, 00:45:34.883 { 00:45:34.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:45:34.883 "dma_device_type": 2 00:45:34.883 } 00:45:34.883 ], 00:45:34.883 "driver_specific": {} 00:45:34.883 } 00:45:34.883 ] 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:34.883 11:56:18 
blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:34.883 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:45:34.883 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:45:34.884 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d125f91f-6253-57ec-ad6a-ed3572447075"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d125f91f-6253-57ec-ad6a-ed3572447075",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": 
[' ' "efa13f93-6db5-582b-9f75-1f427acbb49a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "efa13f93-6db5-582b-9f75-1f427acbb49a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "db21125e-010c-5f53-b4fe-cc4119700df2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "db21125e-010c-5f53-b4fe-cc4119700df2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "7acd8794-a7b8-56ca-9959-2fd9a65ca15d"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7acd8794-a7b8-56ca-9959-2fd9a65ca15d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:45:34.884 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:45:34.884 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:45:34.884 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:45:34.884 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:45:34.884 11:56:18 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 293012 00:45:34.884 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@949 -- # '[' -z 293012 ']' 00:45:34.884 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # kill -0 293012 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # uname 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:45:35.142 
11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 293012 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 293012' 00:45:35.142 killing process with pid 293012 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # kill 293012 00:45:35.142 11:56:18 blockdev_crypto_qat -- common/autotest_common.sh@973 -- # wait 293012 00:45:35.400 11:56:19 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:45:35.400 11:56:19 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:45:35.400 11:56:19 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:45:35.400 11:56:19 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:35.400 11:56:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:35.659 ************************************ 00:45:35.659 START TEST bdev_hello_world 00:45:35.659 ************************************ 00:45:35.659 11:56:19 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:45:35.659 [2024-06-10 11:56:19.429361] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
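The `killprocess` trace above follows a fixed pattern: confirm the PID is still alive with `kill -0`, read its process name with `ps -o comm=` (the harness compares it against `sudo` to decide whether to escalate), then kill and reap it with `wait`. A self-contained sketch of that pattern, using a throwaway `sleep` in place of the SPDK target:

```shell
# Stand-in for the spdk_tgt process (hypothetical target).
sleep 30 &
pid=$!

# `kill -0` sends no signal; it only fails if the PID no longer exists.
kill -0 "$pid"

# Process name as the harness reads it (here "sleep"; the harness
# branches on whether this is "sudo" before deciding how to kill).
name=$(ps -o comm= -p "$pid")

# Kill and reap; `wait` returns the killed process's non-zero status.
kill "$pid"
wait "$pid" 2>/dev/null || true
echo "killed $name"
```

The `|| true` on `wait` mirrors why the harness tolerates the non-zero exit of a deliberately killed process instead of aborting under `set -e`.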
00:45:35.659 [2024-06-10 11:56:19.429402] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid293556 ] 00:45:35.659 [2024-06-10 11:56:19.511567] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:35.659 [2024-06-10 11:56:19.592508] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:35.917 [2024-06-10 11:56:19.613404] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:45:35.917 [2024-06-10 11:56:19.621435] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:45:35.917 [2024-06-10 11:56:19.629455] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:45:35.917 [2024-06-10 11:56:19.731698] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:45:38.450 [2024-06-10 11:56:21.911741] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:45:38.450 [2024-06-10 11:56:21.911802] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:45:38.450 [2024-06-10 11:56:21.911812] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:38.450 [2024-06-10 11:56:21.919759] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:45:38.450 [2024-06-10 11:56:21.919773] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:45:38.450 [2024-06-10 11:56:21.919780] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:38.450 [2024-06-10 11:56:21.927779] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:45:38.450 
[2024-06-10 11:56:21.927792] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:45:38.450 [2024-06-10 11:56:21.927799] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:38.450 [2024-06-10 11:56:21.935810] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:45:38.450 [2024-06-10 11:56:21.935822] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:45:38.450 [2024-06-10 11:56:21.935829] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:38.450 [2024-06-10 11:56:22.008726] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:45:38.450 [2024-06-10 11:56:22.008762] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:45:38.450 [2024-06-10 11:56:22.008779] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:45:38.450 [2024-06-10 11:56:22.009676] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:45:38.450 [2024-06-10 11:56:22.009735] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:45:38.450 [2024-06-10 11:56:22.009747] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:45:38.450 [2024-06-10 11:56:22.009778] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:45:38.450 00:45:38.450 [2024-06-10 11:56:22.009791] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:45:38.450 00:45:38.450 real 0m3.009s 00:45:38.450 user 0m2.637s 00:45:38.450 sys 0m0.339s 00:45:38.450 11:56:22 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:38.450 11:56:22 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:45:38.450 ************************************ 00:45:38.450 END TEST bdev_hello_world 00:45:38.450 ************************************ 00:45:38.709 11:56:22 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:45:38.709 11:56:22 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:45:38.709 11:56:22 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:38.709 11:56:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:38.709 ************************************ 00:45:38.709 START TEST bdev_bounds 00:45:38.709 ************************************ 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=293931 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 293931' 00:45:38.709 Process bdevio pid: 293931 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 293931 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@830 -- # '[' -z 293931 ']' 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:45:38.709 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:45:38.709 11:56:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:45:38.709 [2024-06-10 11:56:22.519075] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:45:38.709 [2024-06-10 11:56:22.519124] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid293931 ] 00:45:38.709 [2024-06-10 11:56:22.605003] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:45:38.968 [2024-06-10 11:56:22.690644] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:45:38.968 [2024-06-10 11:56:22.690731] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:45:38.968 [2024-06-10 11:56:22.690734] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:38.968 [2024-06-10 11:56:22.711762] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:45:38.968 [2024-06-10 11:56:22.719794] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:45:38.968 [2024-06-10 11:56:22.727810] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt 
will be assigned to module dpdk_cryptodev 00:45:38.968 [2024-06-10 11:56:22.825268] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:45:41.505 [2024-06-10 11:56:24.997209] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:45:41.505 [2024-06-10 11:56:24.997271] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:45:41.505 [2024-06-10 11:56:24.997282] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:41.505 [2024-06-10 11:56:25.005225] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:45:41.505 [2024-06-10 11:56:25.005240] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:45:41.505 [2024-06-10 11:56:25.005247] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:41.505 [2024-06-10 11:56:25.013249] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:45:41.505 [2024-06-10 11:56:25.013263] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:45:41.505 [2024-06-10 11:56:25.013271] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:41.505 [2024-06-10 11:56:25.021268] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:45:41.505 [2024-06-10 11:56:25.021281] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:45:41.505 [2024-06-10 11:56:25.021288] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:41.505 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:45:41.505 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:45:41.505 11:56:25 
blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:45:41.505 I/O targets: 00:45:41.506 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:45:41.506 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:45:41.506 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:45:41.506 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:45:41.506 00:45:41.506 00:45:41.506 CUnit - A unit testing framework for C - Version 2.1-3 00:45:41.506 http://cunit.sourceforge.net/ 00:45:41.506 00:45:41.506 00:45:41.506 Suite: bdevio tests on: crypto_ram3 00:45:41.506 Test: blockdev write read block ...passed 00:45:41.506 Test: blockdev write zeroes read block ...passed 00:45:41.506 Test: blockdev write zeroes read no split ...passed 00:45:41.506 Test: blockdev write zeroes read split ...passed 00:45:41.506 Test: blockdev write zeroes read split partial ...passed 00:45:41.506 Test: blockdev reset ...passed 00:45:41.506 Test: blockdev write read 8 blocks ...passed 00:45:41.506 Test: blockdev write read size > 128k ...passed 00:45:41.506 Test: blockdev write read invalid size ...passed 00:45:41.506 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:45:41.506 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:45:41.506 Test: blockdev write read max offset ...passed 00:45:41.506 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:45:41.506 Test: blockdev writev readv 8 blocks ...passed 00:45:41.506 Test: blockdev writev readv 30 x 1block ...passed 00:45:41.506 Test: blockdev writev readv block ...passed 00:45:41.506 Test: blockdev writev readv size > 128k ...passed 00:45:41.506 Test: blockdev writev readv size > 128k in two iovs ...passed 00:45:41.506 Test: blockdev comparev and writev ...passed 00:45:41.506 Test: blockdev nvme passthru rw ...passed 00:45:41.506 Test: blockdev nvme passthru vendor specific ...passed 00:45:41.506 
Test: blockdev nvme admin passthru ...passed 00:45:41.506 Test: blockdev copy ...passed 00:45:41.506 Suite: bdevio tests on: crypto_ram2 00:45:41.506 Test: blockdev write read block ...passed 00:45:41.506 Test: blockdev write zeroes read block ...passed 00:45:41.506 Test: blockdev write zeroes read no split ...passed 00:45:41.506 Test: blockdev write zeroes read split ...passed 00:45:41.506 Test: blockdev write zeroes read split partial ...passed 00:45:41.506 Test: blockdev reset ...passed 00:45:41.506 Test: blockdev write read 8 blocks ...passed 00:45:41.506 Test: blockdev write read size > 128k ...passed 00:45:41.506 Test: blockdev write read invalid size ...passed 00:45:41.506 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:45:41.506 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:45:41.506 Test: blockdev write read max offset ...passed 00:45:41.506 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:45:41.506 Test: blockdev writev readv 8 blocks ...passed 00:45:41.506 Test: blockdev writev readv 30 x 1block ...passed 00:45:41.506 Test: blockdev writev readv block ...passed 00:45:41.506 Test: blockdev writev readv size > 128k ...passed 00:45:41.506 Test: blockdev writev readv size > 128k in two iovs ...passed 00:45:41.506 Test: blockdev comparev and writev ...passed 00:45:41.506 Test: blockdev nvme passthru rw ...passed 00:45:41.506 Test: blockdev nvme passthru vendor specific ...passed 00:45:41.506 Test: blockdev nvme admin passthru ...passed 00:45:41.506 Test: blockdev copy ...passed 00:45:41.506 Suite: bdevio tests on: crypto_ram1 00:45:41.506 Test: blockdev write read block ...passed 00:45:41.506 Test: blockdev write zeroes read block ...passed 00:45:41.506 Test: blockdev write zeroes read no split ...passed 00:45:41.506 Test: blockdev write zeroes read split ...passed 00:45:41.506 Test: blockdev write zeroes read split partial ...passed 00:45:41.506 Test: blockdev reset ...passed 
00:45:41.506 Test: blockdev write read 8 blocks ...passed 00:45:41.506 Test: blockdev write read size > 128k ...passed 00:45:41.506 Test: blockdev write read invalid size ...passed 00:45:41.506 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:45:41.506 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:45:41.506 Test: blockdev write read max offset ...passed 00:45:41.506 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:45:41.506 Test: blockdev writev readv 8 blocks ...passed 00:45:41.506 Test: blockdev writev readv 30 x 1block ...passed 00:45:41.506 Test: blockdev writev readv block ...passed 00:45:41.506 Test: blockdev writev readv size > 128k ...passed 00:45:41.506 Test: blockdev writev readv size > 128k in two iovs ...passed 00:45:41.506 Test: blockdev comparev and writev ...passed 00:45:41.506 Test: blockdev nvme passthru rw ...passed 00:45:41.506 Test: blockdev nvme passthru vendor specific ...passed 00:45:41.506 Test: blockdev nvme admin passthru ...passed 00:45:41.506 Test: blockdev copy ...passed 00:45:41.506 Suite: bdevio tests on: crypto_ram 00:45:41.506 Test: blockdev write read block ...passed 00:45:41.506 Test: blockdev write zeroes read block ...passed 00:45:41.506 Test: blockdev write zeroes read no split ...passed 00:45:41.506 Test: blockdev write zeroes read split ...passed 00:45:41.765 Test: blockdev write zeroes read split partial ...passed 00:45:41.765 Test: blockdev reset ...passed 00:45:41.765 Test: blockdev write read 8 blocks ...passed 00:45:41.765 Test: blockdev write read size > 128k ...passed 00:45:41.765 Test: blockdev write read invalid size ...passed 00:45:41.765 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:45:41.765 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:45:41.765 Test: blockdev write read max offset ...passed 00:45:41.765 Test: blockdev write read 2 blocks on overlapped address offset ...passed 
00:45:41.765 Test: blockdev writev readv 8 blocks ...passed 00:45:41.766 Test: blockdev writev readv 30 x 1block ...passed 00:45:41.766 Test: blockdev writev readv block ...passed 00:45:41.766 Test: blockdev writev readv size > 128k ...passed 00:45:41.766 Test: blockdev writev readv size > 128k in two iovs ...passed 00:45:41.766 Test: blockdev comparev and writev ...passed 00:45:41.766 Test: blockdev nvme passthru rw ...passed 00:45:41.766 Test: blockdev nvme passthru vendor specific ...passed 00:45:41.766 Test: blockdev nvme admin passthru ...passed 00:45:41.766 Test: blockdev copy ...passed 00:45:41.766 00:45:41.766 Run Summary: Type Total Ran Passed Failed Inactive 00:45:41.766 suites 4 4 n/a 0 0 00:45:41.766 tests 92 92 92 0 0 00:45:41.766 asserts 520 520 520 0 n/a 00:45:41.766 00:45:41.766 Elapsed time = 0.512 seconds 00:45:41.766 0 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 293931 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 293931 ']' 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 293931 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 293931 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 293931' 00:45:41.766 killing process with pid 293931 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 
-- # kill 293931 00:45:41.766 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@973 -- # wait 293931 00:45:42.025 11:56:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:45:42.025 00:45:42.025 real 0m3.432s 00:45:42.025 user 0m9.590s 00:45:42.025 sys 0m0.483s 00:45:42.025 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:42.025 11:56:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:45:42.025 ************************************ 00:45:42.025 END TEST bdev_bounds 00:45:42.025 ************************************ 00:45:42.025 11:56:25 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:45:42.025 11:56:25 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:45:42.025 11:56:25 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:42.025 11:56:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:42.284 ************************************ 00:45:42.284 START TEST bdev_nbd 00:45:42.284 ************************************ 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=294472 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 294472 
/var/tmp/spdk-nbd.sock 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 294472 ']' 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:45:42.284 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:45:42.284 11:56:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:45:42.284 [2024-06-10 11:56:26.047974] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:45:42.285 [2024-06-10 11:56:26.048025] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:45:42.285 [2024-06-10 11:56:26.136188] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:45:42.285 [2024-06-10 11:56:26.220844] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:45:42.544 [2024-06-10 11:56:26.241790] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:45:42.544 [2024-06-10 11:56:26.249817] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:45:42.544 [2024-06-10 11:56:26.257830] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:45:42.544 [2024-06-10 11:56:26.357378] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found 
crypto devices: 96 00:45:45.184 [2024-06-10 11:56:28.538578] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:45:45.184 [2024-06-10 11:56:28.538629] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:45:45.184 [2024-06-10 11:56:28.538641] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:45.184 [2024-06-10 11:56:28.546595] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:45:45.184 [2024-06-10 11:56:28.546610] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:45:45.184 [2024-06-10 11:56:28.546618] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:45.184 [2024-06-10 11:56:28.554615] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:45:45.184 [2024-06-10 11:56:28.554628] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:45:45.184 [2024-06-10 11:56:28.554636] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:45.184 [2024-06-10 11:56:28.562635] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:45:45.184 [2024-06-10 11:56:28.562648] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:45:45.184 [2024-06-10 11:56:28.562656] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 
00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 
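The `waitfornbd` helper exercised at autotest_common.sh@867-872 above polls `/proc/partitions` for the device name, up to 20 tries, before the `dd` read check. A standalone sketch; the second parameter is an assumption added here so the loop can be tried against a file other than `/proc/partitions`:

```shell
# Poll until nbd_name appears as a whole word in the partition table.
# Mirrors the loop logged above: up to 20 attempts, break on first match.
waitfornbd() {
    local nbd_name=$1
    local partitions=${2:-/proc/partitions}   # override is an assumption
    local i=1
    while [ "$i" -le 20 ]; do
        # -w avoids nbd1 matching nbd10/nbd11
        grep -q -w "$nbd_name" "$partitions" && return 0
        i=$((i + 1))
        sleep 0.1
    done
    return 1
}
```

After the device shows up, the real test follows with the `dd if=/dev/nbdX ... bs=4096 count=1 iflag=direct` single-block read seen throughout this log to confirm the device is actually readable.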
00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:45.184 1+0 records in 00:45:45.184 1+0 records out 00:45:45.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000165133 s, 24.8 MB/s 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:45:45.184 11:56:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:45:45.184 11:56:29 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:45.184 1+0 records in 00:45:45.184 1+0 records out 00:45:45.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277674 s, 14.8 MB/s 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:45.184 11:56:29 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:45:45.184 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:45.443 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:45.443 1+0 records in 00:45:45.443 1+0 records out 00:45:45.443 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243601 s, 16.8 MB/s 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:45:45.444 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:45.703 
11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:45.703 1+0 records in 00:45:45.703 1+0 records out 00:45:45.703 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294412 s, 13.9 MB/s 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:45:45.703 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd0", 00:45:45.963 "bdev_name": "crypto_ram" 00:45:45.963 }, 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd1", 00:45:45.963 "bdev_name": "crypto_ram1" 00:45:45.963 }, 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd2", 00:45:45.963 "bdev_name": "crypto_ram2" 00:45:45.963 }, 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd3", 00:45:45.963 "bdev_name": "crypto_ram3" 00:45:45.963 } 00:45:45.963 ]' 00:45:45.963 
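At nbd_common.sh@119 above, the `nbd_get_disks` JSON is reduced to device paths with `jq -r '.[] | .nbd_device'`. The grep/sed pair below is a jq-free approximation for illustration only; it assumes the flat one-key-per-line layout shown in the log and would not survive arbitrary JSON:

```shell
# Sample shaped like the nbd_get_disks output above (abbreviated to two disks).
nbd_disks_json='[
  { "nbd_device": "/dev/nbd0", "bdev_name": "crypto_ram" },
  { "nbd_device": "/dev/nbd1", "bdev_name": "crypto_ram1" }
]'

# Pull out each "nbd_device" value; the real script uses
# jq -r '.[] | .nbd_device' instead.
nbd_disks_name=$(printf '%s\n' "$nbd_disks_json" \
    | grep -o '"nbd_device": "[^"]*"' \
    | sed 's/.*: "//; s/"$//')
printf '%s\n' "$nbd_disks_name"
```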
11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd0", 00:45:45.963 "bdev_name": "crypto_ram" 00:45:45.963 }, 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd1", 00:45:45.963 "bdev_name": "crypto_ram1" 00:45:45.963 }, 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd2", 00:45:45.963 "bdev_name": "crypto_ram2" 00:45:45.963 }, 00:45:45.963 { 00:45:45.963 "nbd_device": "/dev/nbd3", 00:45:45.963 "bdev_name": "crypto_ram3" 00:45:45.963 } 00:45:45.963 ]' 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd0 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:45.963 11:56:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:46.223 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:45:46.482 11:56:30 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:46.482 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:46.741 
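The `nbd_get_count` steps that follow (nbd_common.sh@61-66) count attached devices by grepping the device list for `/dev/nbd`; the `true` recorded in the trace matters because `grep -c` exits non-zero when the count is 0. A sketch of both the empty and the four-disk case (the wrapper name is illustrative):

```shell
# Count /dev/nbd entries in a newline-separated list.
# "|| true" keeps the pipeline from failing on a zero count, matching
# the "grep -c ... / echo '' / true / count=0" sequence in the log.
count_nbd() {
    echo "$1" | grep -c /dev/nbd || true
}

empty_count=$(count_nbd '')
full_count=$(count_nbd '/dev/nbd0
/dev/nbd1
/dev/nbd10
/dev/nbd11')
```

The test then asserts `'[' "$count" -ne "$expected" ']'` style mismatches, which is why the log shows both `count=0` after the stop phase and `count=4` after the restart phase below.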
11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:45:46.741 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' 
'/dev/nbd10' '/dev/nbd11') 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:45:47.001 /dev/nbd0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd 
-- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:47.001 1+0 records in 00:45:47.001 1+0 records out 00:45:47.001 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276495 s, 14.8 MB/s 00:45:47.001 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:45:47.260 11:56:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:45:47.260 /dev/nbd1 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:45:47.260 11:56:31 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:47.260 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:47.260 1+0 records in 00:45:47.261 1+0 records out 00:45:47.261 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312541 s, 13.1 MB/s 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:45:47.261 11:56:31 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:45:47.261 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:45:47.520 /dev/nbd10 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:47.520 1+0 records in 00:45:47.520 1+0 records out 00:45:47.520 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309348 s, 13.2 MB/s 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:45:47.520 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:45:47.779 /dev/nbd11 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:45:47.779 1+0 records in 
00:45:47.779 1+0 records out 00:45:47.779 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300569 s, 13.6 MB/s 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:47.779 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd0", 00:45:48.038 "bdev_name": "crypto_ram" 00:45:48.038 }, 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd1", 00:45:48.038 "bdev_name": "crypto_ram1" 00:45:48.038 }, 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd10", 00:45:48.038 "bdev_name": "crypto_ram2" 00:45:48.038 }, 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd11", 00:45:48.038 "bdev_name": "crypto_ram3" 00:45:48.038 } 00:45:48.038 ]' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd0", 00:45:48.038 "bdev_name": "crypto_ram" 00:45:48.038 }, 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd1", 00:45:48.038 "bdev_name": "crypto_ram1" 00:45:48.038 }, 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd10", 00:45:48.038 "bdev_name": "crypto_ram2" 00:45:48.038 }, 00:45:48.038 { 00:45:48.038 "nbd_device": "/dev/nbd11", 00:45:48.038 "bdev_name": "crypto_ram3" 00:45:48.038 } 00:45:48.038 ]' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:45:48.038 /dev/nbd1 00:45:48.038 /dev/nbd10 00:45:48.038 /dev/nbd11' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:45:48.038 /dev/nbd1 00:45:48.038 /dev/nbd10 00:45:48.038 /dev/nbd11' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:45:48.038 256+0 records in 00:45:48.038 256+0 records out 00:45:48.038 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00355836 s, 295 MB/s 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:45:48.038 256+0 records in 00:45:48.038 256+0 records out 00:45:48.038 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0566793 s, 18.5 MB/s 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:45:48.038 256+0 records in 00:45:48.038 256+0 records out 00:45:48.038 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0417368 s, 25.1 MB/s 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:45:48.038 11:56:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:45:48.298 256+0 records in 00:45:48.298 256+0 records out 00:45:48.298 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0393667 s, 26.6 MB/s 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:45:48.298 11:56:32 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:45:48.298 256+0 records in 00:45:48.298 256+0 records out 00:45:48.298 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0373817 s, 28.1 MB/s 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:48.298 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- 
# grep -q -w nbd0 /proc/partitions 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:48.558 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:45:48.816 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:45:48.816 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 
00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:48.817 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:49.076 11:56:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 
00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:45:49.334 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:45:49.334 malloc_lvol_verify 00:45:49.593 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:45:49.593 aaa98f38-1991-4d4f-bbed-02913734ee75 00:45:49.593 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:45:49.852 8e244783-06be-4fdf-b739-9a2cf8cf18e1 00:45:49.852 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:45:50.111 /dev/nbd0 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:45:50.111 mke2fs 1.46.5 (30-Dec-2021) 00:45:50.111 Discarding device blocks: 0/4096 done 00:45:50.111 Creating filesystem with 4096 1k blocks and 1024 inodes 00:45:50.111 00:45:50.111 Allocating group tables: 0/1 done 00:45:50.111 Writing inode tables: 0/1 done 00:45:50.111 Creating journal (1024 blocks): done 00:45:50.111 Writing superblocks and filesystem accounting information: 0/1 done 00:45:50.111 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:45:50.111 11:56:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:45:50.111 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:45:50.111 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:45:50.111 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 294472 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 294472 ']' 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 294472 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 294472 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd 
-- common/autotest_common.sh@967 -- # echo 'killing process with pid 294472' 00:45:50.371 killing process with pid 294472 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # kill 294472 00:45:50.371 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@973 -- # wait 294472 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:45:50.630 00:45:50.630 real 0m8.464s 00:45:50.630 user 0m10.542s 00:45:50.630 sys 0m3.372s 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:45:50.630 ************************************ 00:45:50.630 END TEST bdev_nbd 00:45:50.630 ************************************ 00:45:50.630 11:56:34 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:45:50.630 11:56:34 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:45:50.630 11:56:34 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:45:50.630 11:56:34 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:45:50.630 11:56:34 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:45:50.630 11:56:34 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:50.630 11:56:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:45:50.630 ************************************ 00:45:50.630 START TEST bdev_fio 00:45:50.630 ************************************ 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:45:50.630 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1300 -- # cat 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:45:50.630 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:45:50.890 
11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:45:50.890 ************************************ 00:45:50.890 START TEST bdev_fio_rw_verify 00:45:50.890 ************************************ 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:45:50.890 11:56:34 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:45:50.890 11:56:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:45:51.149 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:45:51.149 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:45:51.149 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:45:51.149 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:45:51.149 fio-3.35 00:45:51.149 Starting 4 threads 00:46:06.031 00:46:06.031 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=296180: Mon Jun 10 11:56:47 2024 00:46:06.031 
read: IOPS=25.8k, BW=101MiB/s (106MB/s)(1009MiB/10001msec) 00:46:06.031 slat (usec): min=11, max=533, avg=55.46, stdev=29.42 00:46:06.031 clat (usec): min=14, max=2263, avg=302.67, stdev=185.18 00:46:06.031 lat (usec): min=43, max=2500, avg=358.13, stdev=198.97 00:46:06.031 clat percentiles (usec): 00:46:06.031 | 50.000th=[ 260], 99.000th=[ 865], 99.900th=[ 1045], 99.990th=[ 1516], 00:46:06.031 | 99.999th=[ 2245] 00:46:06.031 write: IOPS=28.4k, BW=111MiB/s (116MB/s)(1082MiB/9748msec); 0 zone resets 00:46:06.031 slat (usec): min=17, max=432, avg=62.57, stdev=28.00 00:46:06.031 clat (usec): min=14, max=1621, avg=328.07, stdev=186.75 00:46:06.031 lat (usec): min=47, max=1812, avg=390.65, stdev=199.35 00:46:06.031 clat percentiles (usec): 00:46:06.031 | 50.000th=[ 293], 99.000th=[ 889], 99.900th=[ 1074], 99.990th=[ 1221], 00:46:06.031 | 99.999th=[ 1549] 00:46:06.031 bw ( KiB/s): min=82504, max=169800, per=97.88%, avg=111223.21, stdev=6194.03, samples=76 00:46:06.031 iops : min=20626, max=42450, avg=27805.79, stdev=1548.50, samples=76 00:46:06.031 lat (usec) : 20=0.01%, 50=0.07%, 100=7.15%, 250=37.48%, 500=39.04% 00:46:06.031 lat (usec) : 750=13.40%, 1000=2.61% 00:46:06.031 lat (msec) : 2=0.25%, 4=0.01% 00:46:06.031 cpu : usr=99.69%, sys=0.00%, ctx=53, majf=0, minf=274 00:46:06.031 IO depths : 1=4.2%, 2=27.4%, 4=54.7%, 8=13.7%, 16=0.0%, 32=0.0%, >=64=0.0% 00:46:06.031 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:46:06.031 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:46:06.031 issued rwts: total=258328,276923,0,0 short=0,0,0,0 dropped=0,0,0,0 00:46:06.031 latency : target=0, window=0, percentile=100.00%, depth=8 00:46:06.031 00:46:06.031 Run status group 0 (all jobs): 00:46:06.031 READ: bw=101MiB/s (106MB/s), 101MiB/s-101MiB/s (106MB/s-106MB/s), io=1009MiB (1058MB), run=10001-10001msec 00:46:06.031 WRITE: bw=111MiB/s (116MB/s), 111MiB/s-111MiB/s (116MB/s-116MB/s), io=1082MiB (1134MB), run=9748-9748msec 
00:46:06.031 00:46:06.031 real 0m13.360s 00:46:06.031 user 0m45.506s 00:46:06.031 sys 0m0.439s 00:46:06.031 11:56:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:06.031 11:56:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:46:06.031 ************************************ 00:46:06.031 END TEST bdev_fio_rw_verify 00:46:06.031 ************************************ 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:46:06.031 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d125f91f-6253-57ec-ad6a-ed3572447075"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d125f91f-6253-57ec-ad6a-ed3572447075",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "efa13f93-6db5-582b-9f75-1f427acbb49a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "efa13f93-6db5-582b-9f75-1f427acbb49a",' 
' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "db21125e-010c-5f53-b4fe-cc4119700df2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "db21125e-010c-5f53-b4fe-cc4119700df2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "7acd8794-a7b8-56ca-9959-2fd9a65ca15d"' ' ],' ' 
"product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7acd8794-a7b8-56ca-9959-2fd9a65ca15d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:46:06.032 crypto_ram1 00:46:06.032 crypto_ram2 00:46:06.032 crypto_ram3 ]] 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d125f91f-6253-57ec-ad6a-ed3572447075"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d125f91f-6253-57ec-ad6a-ed3572447075",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": 
false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "efa13f93-6db5-582b-9f75-1f427acbb49a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "efa13f93-6db5-582b-9f75-1f427acbb49a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "db21125e-010c-5f53-b4fe-cc4119700df2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "db21125e-010c-5f53-b4fe-cc4119700df2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' 
"flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "7acd8794-a7b8-56ca-9959-2fd9a65ca15d"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7acd8794-a7b8-56ca-9959-2fd9a65ca15d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo 
filename=crypto_ram 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:46:06.032 ************************************ 00:46:06.032 START 
TEST bdev_fio_trim 00:46:06.032 ************************************ 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:46:06.032 11:56:48 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:46:06.032 11:56:48 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:46:06.032 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:46:06.032 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:46:06.032 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:46:06.032 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:46:06.032 fio-3.35 00:46:06.032 Starting 4 threads 00:46:19.041 00:46:19.041 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=298034: Mon Jun 10 11:57:01 2024 00:46:19.041 write: IOPS=36.3k, BW=142MiB/s (149MB/s)(1420MiB/10001msec); 0 zone resets 00:46:19.041 slat (usec): min=11, max=1263, avg=62.96, stdev=30.76 00:46:19.041 clat (usec): min=15, max=1537, avg=231.76, stdev=126.84 00:46:19.041 lat (usec): min=27, max=1655, avg=294.72, stdev=145.35 00:46:19.041 clat percentiles (usec): 00:46:19.041 | 50.000th=[ 210], 99.000th=[ 586], 99.900th=[ 693], 99.990th=[ 816], 00:46:19.041 | 99.999th=[ 1254] 00:46:19.041 bw ( KiB/s): min=129760, max=225920, per=100.00%, avg=146048.00, stdev=6416.81, samples=76 00:46:19.041 iops : min=32440, max=56480, avg=36512.00, stdev=1604.20, samples=76 00:46:19.041 trim: IOPS=36.3k, BW=142MiB/s (149MB/s)(1420MiB/10001msec); 0 zone resets 00:46:19.041 slat (usec): min=4, max=282, avg=18.31, stdev= 7.65 00:46:19.041 clat (usec): min=8, max=1655, avg=294.85, stdev=145.36 00:46:19.041 lat (usec): min=31, max=1678, avg=313.16, stdev=148.29 00:46:19.041 clat percentiles (usec): 00:46:19.041 | 50.000th=[ 269], 99.000th=[ 701], 99.900th=[ 832], 99.990th=[ 996], 00:46:19.041 | 99.999th=[ 1500] 00:46:19.041 bw ( KiB/s): min=129760, max=225920, per=100.00%, avg=146048.00, stdev=6416.81, samples=76 00:46:19.041 iops : min=32440, max=56480, 
avg=36512.00, stdev=1604.20, samples=76 00:46:19.041 lat (usec) : 10=0.01%, 20=0.01%, 50=0.58%, 100=7.48%, 250=46.11% 00:46:19.041 lat (usec) : 500=37.73%, 750=7.94%, 1000=0.16% 00:46:19.041 lat (msec) : 2=0.01% 00:46:19.041 cpu : usr=99.67%, sys=0.00%, ctx=57, majf=0, minf=108 00:46:19.041 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:46:19.041 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:46:19.041 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:46:19.041 issued rwts: total=0,363410,363411,0 short=0,0,0,0 dropped=0,0,0,0 00:46:19.041 latency : target=0, window=0, percentile=100.00%, depth=8 00:46:19.041 00:46:19.041 Run status group 0 (all jobs): 00:46:19.041 WRITE: bw=142MiB/s (149MB/s), 142MiB/s-142MiB/s (149MB/s-149MB/s), io=1420MiB (1489MB), run=10001-10001msec 00:46:19.041 TRIM: bw=142MiB/s (149MB/s), 142MiB/s-142MiB/s (149MB/s-149MB/s), io=1420MiB (1489MB), run=10001-10001msec 00:46:19.041 00:46:19.041 real 0m13.451s 00:46:19.041 user 0m45.356s 00:46:19.041 sys 0m0.462s 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:46:19.041 ************************************ 00:46:19.041 END TEST bdev_fio_trim 00:46:19.041 ************************************ 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:46:19.041 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:46:19.041 00:46:19.041 real 0m27.130s 00:46:19.041 user 
1m31.023s 00:46:19.041 sys 0m1.082s 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:46:19.041 ************************************ 00:46:19.041 END TEST bdev_fio 00:46:19.041 ************************************ 00:46:19.041 11:57:01 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:46:19.041 11:57:01 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:46:19.041 11:57:01 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:46:19.041 11:57:01 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:46:19.041 11:57:01 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:46:19.041 ************************************ 00:46:19.041 START TEST bdev_verify 00:46:19.041 ************************************ 00:46:19.041 11:57:01 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:46:19.041 [2024-06-10 11:57:01.758977] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:46:19.042 [2024-06-10 11:57:01.759023] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid299580 ] 00:46:19.042 [2024-06-10 11:57:01.839974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:46:19.042 [2024-06-10 11:57:01.921262] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:46:19.042 [2024-06-10 11:57:01.921265] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:19.042 [2024-06-10 11:57:01.942241] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:46:19.042 [2024-06-10 11:57:01.950267] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:46:19.042 [2024-06-10 11:57:01.958281] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:46:19.042 [2024-06-10 11:57:02.056658] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:46:20.419 [2024-06-10 11:57:04.229639] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:46:20.419 [2024-06-10 11:57:04.229705] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:46:20.419 [2024-06-10 11:57:04.229715] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:20.419 [2024-06-10 11:57:04.237658] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:46:20.419 [2024-06-10 11:57:04.237672] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:46:20.419 [2024-06-10 11:57:04.237680] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:20.419 [2024-06-10 
11:57:04.245680] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:46:20.419 [2024-06-10 11:57:04.245694] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:46:20.419 [2024-06-10 11:57:04.245702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:20.419 [2024-06-10 11:57:04.253699] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:46:20.419 [2024-06-10 11:57:04.253711] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:46:20.419 [2024-06-10 11:57:04.253719] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:20.419 Running I/O for 5 seconds... 00:46:25.716 00:46:25.716 Latency(us) 00:46:25.716 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:46:25.716 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x0 length 0x1000 00:46:25.716 crypto_ram : 5.06 732.36 2.86 0.00 0.00 174423.23 2892.13 121270.09 00:46:25.716 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x1000 length 0x1000 00:46:25.716 crypto_ram : 5.05 735.43 2.87 0.00 0.00 173747.48 10599.74 121270.09 00:46:25.716 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x0 length 0x1000 00:46:25.716 crypto_ram1 : 5.06 733.80 2.87 0.00 0.00 173707.54 3120.08 108048.92 00:46:25.716 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x1000 length 0x1000 00:46:25.716 crypto_ram1 : 5.05 735.22 2.87 0.00 0.00 173304.55 12537.32 108048.92 00:46:25.716 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA 
range: start 0x0 length 0x1000 00:46:25.716 crypto_ram2 : 5.05 5731.84 22.39 0.00 0.00 22172.26 4188.61 17780.20 00:46:25.716 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x1000 length 0x1000 00:46:25.716 crypto_ram2 : 5.04 5795.70 22.64 0.00 0.00 21921.67 5812.76 17894.18 00:46:25.716 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x0 length 0x1000 00:46:25.716 crypto_ram3 : 5.05 5730.25 22.38 0.00 0.00 22137.45 4217.10 16526.47 00:46:25.716 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:46:25.716 Verification LBA range: start 0x1000 length 0x1000 00:46:25.716 crypto_ram3 : 5.04 5811.18 22.70 0.00 0.00 21831.09 1474.56 16412.49 00:46:25.716 =================================================================================================================== 00:46:25.716 Total : 26005.79 101.59 0.00 0.00 39183.71 1474.56 121270.09 00:46:25.975 00:46:25.975 real 0m8.028s 00:46:25.975 user 0m15.378s 00:46:25.975 sys 0m0.307s 00:46:25.975 11:57:09 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:25.975 11:57:09 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:46:25.975 ************************************ 00:46:25.975 END TEST bdev_verify 00:46:25.975 ************************************ 00:46:25.975 11:57:09 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:46:25.975 11:57:09 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:46:25.975 11:57:09 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:46:25.975 11:57:09 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:46:25.975 ************************************ 00:46:25.975 START TEST bdev_verify_big_io 00:46:25.975 ************************************ 00:46:25.975 11:57:09 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:46:25.975 [2024-06-10 11:57:09.871547] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:46:25.975 [2024-06-10 11:57:09.871590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid301033 ] 00:46:26.234 [2024-06-10 11:57:09.953997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:46:26.234 [2024-06-10 11:57:10.037837] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:46:26.234 [2024-06-10 11:57:10.037839] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:26.234 [2024-06-10 11:57:10.058834] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:46:26.234 [2024-06-10 11:57:10.066860] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:46:26.234 [2024-06-10 11:57:10.074879] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:46:26.234 [2024-06-10 11:57:10.170820] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:46:28.769 [2024-06-10 11:57:12.342433] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:46:28.769 [2024-06-10 11:57:12.342503] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: Malloc0 00:46:28.769 [2024-06-10 11:57:12.342514] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:28.769 [2024-06-10 11:57:12.350450] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:46:28.769 [2024-06-10 11:57:12.350470] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:46:28.769 [2024-06-10 11:57:12.350478] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:28.769 [2024-06-10 11:57:12.358471] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:46:28.769 [2024-06-10 11:57:12.358484] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:46:28.769 [2024-06-10 11:57:12.358492] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:28.769 [2024-06-10 11:57:12.366492] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:46:28.769 [2024-06-10 11:57:12.366505] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:46:28.769 [2024-06-10 11:57:12.366512] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:46:28.769 Running I/O for 5 seconds... 00:46:29.028 [2024-06-10 11:57:12.955191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.028 [2024-06-10 11:57:12.955540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.028 [2024-06-10 11:57:12.955798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.028 [2024-06-10 11:57:12.956056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.294 [2024-06-10 11:57:13.020749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.023827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.294 [2024-06-10 11:57:13.026304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.026347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.026398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.026438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.294 [2024-06-10 11:57:13.026776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.026822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.026862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.026895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.027162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.027176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.027186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.027198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.315 [2024-06-10 11:57:13.029376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.029857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.030214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.030229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.030239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.030250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.315 [2024-06-10 11:57:13.032383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.032906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.033215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.033229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.033239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.033256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.035216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.035249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.035278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.315 [2024-06-10 11:57:13.035309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.035671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.035702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.315 [2024-06-10 11:57:13.035734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.035762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.036086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.036101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.036112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.036124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.038537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.038950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.041565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.041992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.042003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.043894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.043926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.043955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.043982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.044354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.044721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.046639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.046672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.046700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.046731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.047213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.047556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.048765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.048797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.048825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.048857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.049343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.049376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.051922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.051945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.053882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.054672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.055089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.055388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.056667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.057485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.057750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.058018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.058284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.058299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.316 [2024-06-10 11:57:13.058309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.058320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.060799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.061348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.062346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.316 [2024-06-10 11:57:13.063406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.064548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.064834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.065099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.065356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.065704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.065718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.065729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.317 [2024-06-10 11:57:13.065742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.068004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.068760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.069544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.070493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.071370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.071632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.071893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.072156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.072480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.072494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.072504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.317 [2024-06-10 11:57:13.072515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.317 [2024-06-10 11:57:13.074079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.585 [2024-06-10 11:57:13.238170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.238180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.238189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.240272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.241231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.241875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.242138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.242736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.243003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.244062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.244996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.245186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.245198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.585 [2024-06-10 11:57:13.245207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.245217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.247343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.248281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.248549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.248812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.249335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.249922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.250713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.251660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.251848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.251860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.251874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.585 [2024-06-10 11:57:13.251885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.253956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.254396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.254657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.254918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.255534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.256438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.257430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.258406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.258696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.258711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.258720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.258730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.585 [2024-06-10 11:57:13.260168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.260431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.260691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.260958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.261548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.261807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.262069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.262330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.262602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.262616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.262629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.262640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.264632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.585 [2024-06-10 11:57:13.264911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.265178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.265209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.265825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.266936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.269004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.585 [2024-06-10 11:57:13.269287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.585 [2024-06-10 11:57:13.269554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.269826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.269860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.270215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.270487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.270752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.271021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.271285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.271631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.271645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.271656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.271667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.273402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.273891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.274235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.274251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.274263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.274274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.275930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.275966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.275995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.276853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.278664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.278697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.278724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.278751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.279540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.281308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.281810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.282105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.282120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.282130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.282140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.283906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.283938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.283999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.284820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.286596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.286630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.286659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.286689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.286945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.286996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.287507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.289215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.586 [2024-06-10 11:57:13.289268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.289307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.586 [2024-06-10 11:57:13.289344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.289660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.289700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.289729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.289757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.289787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.290117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.290132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.290143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.290155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.587 [2024-06-10 11:57:13.291817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.587 [2024-06-10 11:57:13.291850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.590 [... identical "Failed to get src_mbufs!" message repeated through 2024-06-10 11:57:13.344498 ...] 
00:46:29.591 [2024-06-10 11:57:13.344530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.344920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.345105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.345119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.345129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.345140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.346687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.346718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.346748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.346777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.347453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.348564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.348596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.349943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.351740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.351775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.351803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.352477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.352711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.352758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.352787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.352815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.352843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.353026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.353039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.353050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.353065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.355116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.356050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.356319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.356581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.356861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.357136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.357691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.358478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.359409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.359594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.359606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.359616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.359625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.361695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.362283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.362553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.362814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.363190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.363461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.364423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.365468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.366514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.366699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.366712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.366721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.366731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.368774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.369043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.369303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.369562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.369898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.370733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.371546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.372483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.373413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.373668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.373682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.373692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.373702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.375119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.591 [2024-06-10 11:57:13.375391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.375658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.591 [2024-06-10 11:57:13.375926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.376202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.377013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.377985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.378936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.379696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.379902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.379914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.379924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.379934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.381280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.592 [2024-06-10 11:57:13.381542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.381812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.382120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.382303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.383205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.384144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.385111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.385620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.385841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.385854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.385864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.385879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.387282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.592 [2024-06-10 11:57:13.387543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.387804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.388666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.388897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.389847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.390778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.391334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.392349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.392533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.392547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.392557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.392568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.394175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.592 [2024-06-10 11:57:13.394438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.394983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.395756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.395943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.396911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.397813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.398536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.399322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.399506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.399518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.399527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.399536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.401231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.592 [2024-06-10 11:57:13.401498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.402470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.403517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.403702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.404651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.405084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.405860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.406796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.406984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.406997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.407006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.407015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.408881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.592 [2024-06-10 11:57:13.409705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.410486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.411428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.411612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.412235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.413294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.414237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.415221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.415419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.415432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.415441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.415450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.592 [2024-06-10 11:57:13.417721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.592 [2024-06-10 11:57:13.418512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.864 [2024-06-10 11:57:13.558231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.558266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.558524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.558814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.559100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.559367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.559632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.559899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.560229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.560243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.560253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.560263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.562236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.864 [2024-06-10 11:57:13.562509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.562786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.562829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.563158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.563439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.563706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.563975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.564247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.564500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.564515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.564526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.564537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.864 [2024-06-10 11:57:13.566458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.566968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.567000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.567285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.567299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.567314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.567325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.569114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.864 [2024-06-10 11:57:13.569147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.569175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.569203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.864 [2024-06-10 11:57:13.569494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.569543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.569573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.569614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.569642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.570001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.570016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.570026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.570038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.571779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.571837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.571879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.571908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.572697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.574414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.574975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.575257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.575271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.575281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.575291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.576978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.577011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.577946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.579698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.579733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.579761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.579789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.580590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.582401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.582946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.583225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.583239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.583250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.583262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.585042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.585961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.587727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.865 [2024-06-10 11:57:13.587760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.587789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.587816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.588082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.865 [2024-06-10 11:57:13.588135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.588586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.590705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.866 [2024-06-10 11:57:13.590758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.590798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.590837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.591693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.593445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.866 [2024-06-10 11:57:13.593478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.593524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.593558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.593899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.593958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.593988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.594016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.594045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.594370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.594385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.594396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.594409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.866 [2024-06-10 11:57:13.596055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.866 [2024-06-10 11:57:13.596087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.870 [identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously from 2024-06-10 11:57:13.596087 through 11:57:13.643192; duplicate lines omitted] 
00:46:29.870 [2024-06-10 11:57:13.643464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.643729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.643998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.644336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.645329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.646357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.647419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.648423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.648690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.648704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.648714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.648725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.649996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.870 [2024-06-10 11:57:13.650261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.650526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.650786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.650979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.651751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.652685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.653616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.654046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.654232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.654250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.654262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.654273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.655721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.870 [2024-06-10 11:57:13.655992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.656257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.656950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.657191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.658143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.659083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.659778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.660719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.660955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.660969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.660980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.660990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.662518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.870 [2024-06-10 11:57:13.662783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.663185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.663978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.664164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.665257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.666282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.666913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.667696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.667890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.667903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.667913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.667923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.669656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.870 [2024-06-10 11:57:13.669930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.670968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.671940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.672142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.673093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.870 [2024-06-10 11:57:13.673514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.674370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.675319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.675507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.675519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.675529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.675539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.677299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.678012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.678815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.679760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.679955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.680674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.681607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.682435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.683365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.683553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.683567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.683577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.683587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.685748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.686534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.687467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.688400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.688586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.689213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.689993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.690918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.691856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.692089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.692103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.692113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.692124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.695042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.695989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.697002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.698078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.698403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.699251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.700189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.701114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.701973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.702242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.702255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.702265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.702275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.704753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.705687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.706656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.707197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.707391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.708222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.709188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.710157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.710429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.710781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.710796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.710812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.710825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.713355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.714317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.715174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.716015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.716270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.717262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.718202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.718777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.719059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.719409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.719424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.719435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.719447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.721967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.722932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.723419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.724200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.724385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.725385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.726350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.726618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.726889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.727230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.727257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.727268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.727279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.729492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.871 [2024-06-10 11:57:13.730026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.730992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.732037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.732224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.733195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.733487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.733754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.734045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.871 [2024-06-10 11:57:13.734391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.734406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.734417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.734430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.736454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.872 [2024-06-10 11:57:13.737224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.738012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.738959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.739147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.739775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.740944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:29.872 [2024-06-10 11:57:13.742373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:29.872 [2024-06-10 11:57:13.743160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.744091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.745025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.745212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.745489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.745749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.746028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.746290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.746496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.746510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.746519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.746529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.748742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.749699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.750697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.751769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.752077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.752355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.752616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.752878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.753566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.753798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.753812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.753822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.753832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.755697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.756628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.757538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.758012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.758354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.758632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.758911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.759175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.760129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.760320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.760333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.760346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.760356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.762439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.763372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.764101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.764364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.764694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.764972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.765235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.766244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.767151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.767340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.767353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.767363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.767373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.769501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.770542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.770806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.771072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.771351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.771635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.772225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.773037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.774010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.774202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.774215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.774225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.774236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.776316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.776665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.776930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.777194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.777533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.778040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.778857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.779785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.780715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.780912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.780927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.872 [2024-06-10 11:57:13.780938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.780952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.782630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.782907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.783169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.783444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.783797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.784783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.785662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.786601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.787591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.787960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.787974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.787985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.787996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.789340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.789606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.789871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.790132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.790316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.791138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.792100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.793039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.793454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.793639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.793653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.793665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.793676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.795087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.795359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.795631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.796127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.796335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.797410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.798406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.798899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.799620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.799810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.799824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.799834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:29.873 [2024-06-10 11:57:13.799845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.801857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.802846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.803593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.804462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.804662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.805382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.805981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.806715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.807570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.807761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.807775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.807785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.807804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.809603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.809890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.810162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.810440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.810807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.811091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.811375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.811643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.811923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.812235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.812250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.812261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.812273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.814325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.814606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.814882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.815152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.815448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.815727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.816924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.818824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.819105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.819374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.819646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.819980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.201 [2024-06-10 11:57:13.820278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.820555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.820827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.821104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.821469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.821484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.821497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.821513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.823466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.823743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.824023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.824300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.824581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.824865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.825142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.825422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.825693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.825977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.825993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.826004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.826015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.827964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.828247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.828522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.828795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.829124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.829405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.829677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.829953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.830234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.830530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.830547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.830558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.830569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.832533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.832812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.833088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.833357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.833689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.833978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.834256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.834533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.834805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.835124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.835140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.835151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.835162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.837109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.837387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.837657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.837932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.838230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.838513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.838781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.839051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.839319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.839654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.839667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.839678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.839689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.841686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.841967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.842005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.842275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.842577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.842857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.843995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.846250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.846533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.846814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.846850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.847186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.847467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.847735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.848019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.848294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.848654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.848668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.848679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.848691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.202 [2024-06-10 11:57:13.850908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.850937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.850967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.851301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.851315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.851327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.851339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.853922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.855680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.203 [2024-06-10 11:57:13.855715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.855744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.855772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.856572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.203 [2024-06-10 11:57:13.858266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.858763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.859103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.859121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.859133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.859143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.860950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.203 [2024-06-10 11:57:13.860983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.861795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.863477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.203 [2024-06-10 11:57:13.863515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.863542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.863570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.863899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.863948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.863982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.864029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.864072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.864427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.864441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.864451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.864462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.203 [2024-06-10 11:57:13.866370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.866886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.867187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.867200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.867212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.867222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.203 [2024-06-10 11:57:13.869131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.203 [2024-06-10 11:57:13.869593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.869621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.869956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.869973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.869985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.869997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.871765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.871820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.871858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.871894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.872672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.874241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.874954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.876713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.876747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.876777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.876816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.877619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.879281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.879889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.881142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.881963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.883532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.883903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.884086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.884098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.884108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.884117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.204 [2024-06-10 11:57:13.885316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.204 [2024-06-10 11:57:13.885698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.205 [2024-06-10 11:57:13.886031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.205 [2024-06-10 11:57:13.886046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.205 [2024-06-10 11:57:13.886057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.205 [2024-06-10 11:57:13.886067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.205 [2024-06-10 11:57:13.887691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.205 [2024-06-10 11:57:13.887725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.208 [... previous message repeated ~270 more times, 2024-06-10 11:57:13.887762 through 11:57:13.952193; duplicate occurrences collapsed ...] 
00:46:30.208 [2024-06-10 11:57:13.952459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.952721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.953502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.953728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.954675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.955603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.956245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.957263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.957504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.957518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.957528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.957537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.959208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.208 [2024-06-10 11:57:13.959485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.959879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.960676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.960861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.961881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.962997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.963597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.964376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.964559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.964571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.964580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.964594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.966365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.208 [2024-06-10 11:57:13.966632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.967707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.968635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.968819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.969817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.970254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.971107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.972112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.972302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.972315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.972325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.972335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.974180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.208 [2024-06-10 11:57:13.974921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.975697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.976627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.976810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.977531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.978526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.979408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.980348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.980532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.980546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.980556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.980566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.982761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.208 [2024-06-10 11:57:13.983552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.984474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.985391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.985580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.986270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.987037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.987961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.988880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.989104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.989117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.989128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.989138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.992104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.208 [2024-06-10 11:57:13.993009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.993954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.994946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.995237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.996141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.997137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.998081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.998998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.999254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.208 [2024-06-10 11:57:13.999268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:13.999277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:13.999287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.001784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.002723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.003652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.004207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.004393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.005182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.006103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.007043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.007383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.007743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.007757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.007768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.007779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.010344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.011335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.012263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.012969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.013197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.014156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.015086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.015781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.016054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.016387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.016401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.016414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.016426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.018896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.019822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.020230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.021016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.021199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.022221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.023277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.023547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.023810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.024074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.024088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.024099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.024110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.026417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.027122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.028073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.028913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.029097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.030054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.030520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.030788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.031051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.031410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.031424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.031435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.031446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.033671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.034279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.035058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.036000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.036185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.037013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.037277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.037541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.037805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.038133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.038147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.038160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.038172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.039656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.040512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.041451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.042386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.042571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.042851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.043902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.045955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.209 [2024-06-10 11:57:14.046797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.047717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.048649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.048940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.049220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.049484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.049744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.050162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.050347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.050361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.209 [2024-06-10 11:57:14.050372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.050383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.052283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.210 [2024-06-10 11:57:14.053224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.054165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.054760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.055086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.055369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.055638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.055950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.056970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.057168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.057181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.057190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.057200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.210 [2024-06-10 11:57:14.059298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.475 [2024-06-10 11:57:14.146849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (last message repeated 260 times between 11:57:14.060233 and 11:57:14.146849)
00:46:30.475 [2024-06-10 11:57:14.146906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.146948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.146979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.147572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.475 [2024-06-10 11:57:14.149318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.149704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.150026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.150043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.150056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.150067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.151784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.475 [2024-06-10 11:57:14.151818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.151847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.151882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.152243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.152297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.152329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.152357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.152388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.475 [2024-06-10 11:57:14.152698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.152712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.152723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.152734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.154517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.154550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.154578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.154609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.154937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.154980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.155290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.156512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.156915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.157101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.157114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.157124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.157134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.158751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.158785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.158816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.158844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.159568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.160777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.160823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.160854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.160885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.161427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.162883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.162916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.162944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.162983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.163809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.164893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.164925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.164954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.164981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.165513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.166942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.166976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.167848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.170985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.171022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.171611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.173721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.173757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.173784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.173811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.173995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.174345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.176985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.177026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.177949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.476 [2024-06-10 11:57:14.180608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.476 [2024-06-10 11:57:14.180648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 (same error repeated continuously through [2024-06-10 11:57:14.298406]; duplicate log lines omitted) 
00:46:30.479 [2024-06-10 11:57:14.300597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.301024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.301813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.302757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.302946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.303952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.304212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.304473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.304733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.305063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.305079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.305090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.305105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.306606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.307478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.308427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.309368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.309551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.309828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.310849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.313050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.314107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.315133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.316095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.316399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.316670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.316935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.317195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.318002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.318220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.318232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.318242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.318251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.320308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.321295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.322340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.322610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.322946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.323218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.323478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.324159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.324947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.325131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.325143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.325153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.325162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.327255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.328208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.328478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.328740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.329033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.329305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.329958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.330703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.331526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.331733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.331756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.331765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.331775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.333918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.334197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.334461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.335407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.335757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.336036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.336926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.337917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.338871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.339061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.339074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.339087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.339097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.341099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.341448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.341717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.341987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.342347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.342629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.343640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.344668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.345818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.346024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.346040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.346050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.346059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.348296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.348569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.348829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.349094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.349432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.349706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.349993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.350264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.350532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.350851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.350869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.350880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.350892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.352791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.353057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.353322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.353592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.353884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.354161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.354421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.354683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.354949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.355263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.355277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.355287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.355298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.479 [2024-06-10 11:57:14.357340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.357612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.357884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.358144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.358478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.358749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.359016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.479 [2024-06-10 11:57:14.359279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.359549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.359885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.359900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.359912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.359924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.480 [2024-06-10 11:57:14.362047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.362308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.362569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.362831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.363184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.363459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.363722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.363985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.364244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.364530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.364544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.364554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.364565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.480 [2024-06-10 11:57:14.366509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.366775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.367042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.367302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.367630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.367905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.368168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.368429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.368695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.369054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.369068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.369080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.369091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.480 [2024-06-10 11:57:14.371039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.371307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.371565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.371823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.372164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.372443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.372712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.372974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.373234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.373552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.373566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.373578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.480 [2024-06-10 11:57:14.373592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.480 [2024-06-10 11:57:14.375432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.744 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously from 11:57:14.375 through 11:57:14.440; subsequent duplicate entries omitted ...]
00:46:30.746 [2024-06-10 11:57:14.442249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.442956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.746 [2024-06-10 11:57:14.444021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.444912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.746 [2024-06-10 11:57:14.446284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.446970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.746 [2024-06-10 11:57:14.448004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.448921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.746 [2024-06-10 11:57:14.450522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.450939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.451194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.746 [2024-06-10 11:57:14.451207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.451217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.451228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.747 [2024-06-10 11:57:14.452260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.452782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.453128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.453143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.453154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.453166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.747 [2024-06-10 11:57:14.454620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.454651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.454680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.454706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.454894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.454939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.454984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.455017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.455045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.455224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.455238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.455249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.455260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.747 [2024-06-10 11:57:14.456327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.456845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.457196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.457213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.457224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.457235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.747 [2024-06-10 11:57:14.458697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.458731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.458758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.458785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.458965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.459305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.747 [2024-06-10 11:57:14.460492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.460902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.461249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.461263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.461273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.461283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.747 [2024-06-10 11:57:14.462815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.462853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.462884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.462927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.747 [2024-06-10 11:57:14.463459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.748 [2024-06-10 11:57:14.464615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.464647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.464676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.464716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.464902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.464942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.464977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.465005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.465035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.465357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.465371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.465381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.465391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.748 [2024-06-10 11:57:14.467040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.748 [2024-06-10 11:57:14.467673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.748 [2024-06-10 11:57:14.468833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:30.751 [2024-06-10 11:57:14.617335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.617602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.617903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.617918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.617928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.617939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.620310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.620578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.620841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.621104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.621707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.751 [2024-06-10 11:57:14.621982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.622250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.622513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.622803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.622816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.622827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.622838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.624689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.624954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.625215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.625476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.626044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.626308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.626568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.626826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.627077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.627091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.627101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.627117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.628982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.629250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.629516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.629781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.630381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.630643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.630913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.631182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.631543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.631560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.631571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.631582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.633515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.633782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.634047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.634309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.634910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.635179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.635440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.635700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.636039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.636054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.636065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.636076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.638044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.638304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.638566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.638831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.639360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.639620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.639888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.640152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.640431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.640444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.640455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.640466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.642461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.643314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.643796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.644471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.645047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.645306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.645566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.645826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.646095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.646110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.646120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.646131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.648207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.648479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.648750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.649012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.649629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.649902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.650215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.651061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.651416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.651430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.651440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.651451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.653800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.752 [2024-06-10 11:57:14.654743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.655689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.752 [2024-06-10 11:57:14.656500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.657517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.658442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.659332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.659885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.660212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.660226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.660236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.660246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.661779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.662665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.753 [2024-06-10 11:57:14.663616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.664658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.665861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.666839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.666877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.667811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.668001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.668014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.668024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.668034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.670002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.670265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.671134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.753 [2024-06-10 11:57:14.671169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.672345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.672380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.673321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.673747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.673937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.673952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.673963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.673974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.675353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.675390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.675648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.675678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.753 [2024-06-10 11:57:14.676036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.676831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.677621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.678569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.678753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.678765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.678775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.678784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.680835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.680874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.681193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.681221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.681441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.753 [2024-06-10 11:57:14.681473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.681744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.681777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.682046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.682060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.682070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.682081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.683511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.683542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.683573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.683601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.684749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.684785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:30.753 [2024-06-10 11:57:14.685488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:30.753 [2024-06-10 11:57:14.685520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.685710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.685723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.685734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.685744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.686839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.686887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.686915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.686944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.687298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.687330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.017 [2024-06-10 11:57:14.687360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.017 [2024-06-10 11:57:14.687388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:31.020 [last message repeated from 2024-06-10 11:57:14.687688 through 2024-06-10 11:57:14.732949]
00:46:31.020 [2024-06-10 11:57:14.733208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.733535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.734631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.734669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.734697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.734728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.734936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.020 [2024-06-10 11:57:14.734973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.735004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.735035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.020 [2024-06-10 11:57:14.735215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.735227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.735237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.735247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.736648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.736681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.736709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.736739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.736951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.736982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.021 [2024-06-10 11:57:14.737012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.737047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.737422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.737437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.737450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.737462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.738602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.738639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.738669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.739547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.739806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.739837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.739870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.021 [2024-06-10 11:57:14.739898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.740109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.740123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.740132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.740142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.741414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.741446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.741477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.741505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.741820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.741860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.742122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.742154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.021 [2024-06-10 11:57:14.742376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.742390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.742399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.742409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.743482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.743514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.744545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.744584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.744794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.745733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.745766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.745810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.745991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.021 [2024-06-10 11:57:14.746003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.746013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.746023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.748311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.748347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.748604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.748630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.021 [2024-06-10 11:57:14.749931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.749940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.752938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.753209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.753242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.753271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.753528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.753814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.754089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.754846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.754883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.755668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.755852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.021 [2024-06-10 11:57:14.755869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.755880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.755890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.757960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.758902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.759263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.760059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.021 [2024-06-10 11:57:14.760402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.760443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.760842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.760879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.761537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.761864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.022 [2024-06-10 11:57:14.761882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.761894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.761905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.765478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.766431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.767368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.767702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.768040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.768300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.768548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.768978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.769768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.769954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.022 [2024-06-10 11:57:14.769966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.769976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.769986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.772031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.772987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.773712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.774655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.774965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.775243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.776122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.776412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.776674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.776861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.022 [2024-06-10 11:57:14.776878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.776888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.776898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.780360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.781345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.781608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.781871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.782182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.782456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.783274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.784072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.785003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.785190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.022 [2024-06-10 11:57:14.785202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.785211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.785221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.787261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.787646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.788491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.788752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.789040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.789931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.790192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.790750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.791544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.791728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.022 [2024-06-10 11:57:14.791740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.791749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.791759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.795265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.795539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.795801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.796063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.796387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.797373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.022 [2024-06-10 11:57:14.798449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.023 [2024-06-10 11:57:14.799469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.023 [2024-06-10 11:57:14.800436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.023 [2024-06-10 11:57:14.800696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.023 [2024-06-10 11:57:14.800709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.026 [2024-06-10 11:57:14.934653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.934667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.934678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.938790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.939416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.939452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.940239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.940425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.941391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.942046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.942985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.943253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.943592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.026 [2024-06-10 11:57:14.943608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.943620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.943631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.945877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.946822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.947762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.948240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.948447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.949485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.950455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.950491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.951294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.951523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.026 [2024-06-10 11:57:14.951536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.951546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.951555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.954744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.954786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.955710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.955747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.955964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.956678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.956711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.026 [2024-06-10 11:57:14.957786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.958682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.958877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.289 [2024-06-10 11:57:14.958890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.958900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.958910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.960463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.960512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.961554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.961819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.962147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.962193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.963069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.964058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.289 [2024-06-10 11:57:14.965036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.965225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.965237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.965247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.965258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.968153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.968507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.968776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.968816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.969697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.969718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.970760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.970793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.970821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.970851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.971041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.971862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.971903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.972885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.972919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.973106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.973120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.973130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.973140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.975946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.975970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.977717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.977737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.980906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.980930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.290 [2024-06-10 11:57:14.982741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.982762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.985345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.985390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.985422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.985451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.985800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.290 [2024-06-10 11:57:14.985840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.985877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.985907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.985937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.986199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.291 [2024-06-10 11:57:14.986214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.986224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.986235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.291 [2024-06-10 11:57:14.987948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.987970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.291 [2024-06-10 11:57:14.991960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.291 [2024-06-10 11:57:14.991975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:31.291 [... identical *ERROR* line repeated continuously through 2024-06-10 11:57:15.053822 ...]
00:46:31.294 [2024-06-10 11:57:15.053835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.053845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.053854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.057036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.057686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.057720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.058125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.058453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.058494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.059315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.059350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.059378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.059718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.294 [2024-06-10 11:57:15.059733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.059743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.059755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.062128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.062943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.062982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.063009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.063190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.294 [2024-06-10 11:57:15.064589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.064610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.069581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.069628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.069656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.070597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.070785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.071771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.072463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.072498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.294 [2024-06-10 11:57:15.073295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.073481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.073494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.073503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.073513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.076248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.076516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.077428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.078425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.078611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.078655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.079308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.079346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.080144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.080337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.080349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.080359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.080369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.084285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.084553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.085283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.086071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.086259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.087217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.087885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.088861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.089727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.089916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.089928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.089938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.089948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.092678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.093141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.093943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.094877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.095065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.096022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.096746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.097547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.098486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.098672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.098685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.098698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.098708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.101049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.102006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.103045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.103980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.104169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.104662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.105444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.106390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.107335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.107543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.107566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.107577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.107587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.112210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.113081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.114026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.115012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.115331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.116226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.117192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.118137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.118988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.119232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.119246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.119256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.119266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.122524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.123476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.124425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.124853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.125045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.125911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.126856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.127846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.128297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.128483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.295 [2024-06-10 11:57:15.128497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.128507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.128516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.131168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.132121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.132823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.133768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.134019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.134980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.135909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.136348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.137341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.295 [2024-06-10 11:57:15.137703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.296 [2024-06-10 11:57:15.137717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.137729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.137741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.140565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.140927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.141871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.142812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.143007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.143654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.144200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.144472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.145357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.145660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.296 [2024-06-10 11:57:15.145673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.145683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.145694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.149020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.150105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.151106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.152038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.152299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.153076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.153338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.153929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.154500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.154825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.296 [2024-06-10 11:57:15.154840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.154851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.154863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.158562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.159519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.160455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.160818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.161010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.161288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.161596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.162455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.162715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.162995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.296 [2024-06-10 11:57:15.163009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.163019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.163034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.166660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.167617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.168314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.168987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.169229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.169506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.170127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.170676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.170947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.296 [2024-06-10 11:57:15.171234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.560 [2024-06-10 11:57:15.305207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.305218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.305229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.308445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.309513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.310492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.310528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.310718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.310767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.310798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.311622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.311658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.312028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.560 [2024-06-10 11:57:15.312042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.312054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.312066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.313845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.313890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.313928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.313958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.314147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.315198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.315235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.316046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.316079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.316296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.560 [2024-06-10 11:57:15.316308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.316318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.316328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.560 [2024-06-10 11:57:15.320776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.320798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.323940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.323979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.560 [2024-06-10 11:57:15.324558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.324577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.326829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.326872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.326902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.326931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.560 [2024-06-10 11:57:15.327614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.327634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.560 [2024-06-10 11:57:15.330904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.331210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.331222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.331232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.331242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.334756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.334777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.337906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.337949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.337995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.338734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.338757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.341895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.342168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.342181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.342190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.342200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.344965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.345743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.345765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.348825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.348845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.351915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.351954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.351985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.352838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.352863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.355945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.355981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.561 [2024-06-10 11:57:15.356543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.561 [2024-06-10 11:57:15.356556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 ... last message repeated (identical "Failed to get src_mbufs!" errors from 11:57:15.356566 through 11:57:15.440264 omitted) ... 
00:46:31.565 [2024-06-10 11:57:15.440276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.440286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.440295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.443666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.443941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.444204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.444468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.444753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.445537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.446412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.447313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.448024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.448240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.565 [2024-06-10 11:57:15.448254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.448263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.448273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.451923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.452768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.453687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.454623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.454928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.455920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.456972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.457968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.458921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.459237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.565 [2024-06-10 11:57:15.459251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.459260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.459270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.462923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.463855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.464796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.465429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.465616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.466412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.467344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.468174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.468456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.468781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.565 [2024-06-10 11:57:15.468795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.468806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.468817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.471608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.472589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.473641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.474718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.474924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.475201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.475464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.475721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.475983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.476232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.565 [2024-06-10 11:57:15.476246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.476256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.476266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.479740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.480721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.481342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.481610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.481932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.482200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.482458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.482718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.483005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.483376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.565 [2024-06-10 11:57:15.483392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.483404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.483415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.485965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.486237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.486501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.486768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.487050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.487328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.487587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.487846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.488117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.488436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.565 [2024-06-10 11:57:15.488450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.488460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.488472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.490876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.491148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.491408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.491669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.491948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.492224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.492484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.565 [2024-06-10 11:57:15.492743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.493005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.493321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.566 [2024-06-10 11:57:15.493335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.493346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.493357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.495838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.496123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.496387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.496645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.496987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.497260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.497536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.497819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.498093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.498428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.566 [2024-06-10 11:57:15.498443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.498453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.498464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.500984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.501262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.501540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.566 [2024-06-10 11:57:15.501815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.502150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.502436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.502703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.502977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.503251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.503549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.830 [2024-06-10 11:57:15.503564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.503575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.503587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.506041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.506319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.506586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.506857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.507138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.507423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.507689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.507960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.508232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.508519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.830 [2024-06-10 11:57:15.508534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.508545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.508556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.511151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.511432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.511693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.511974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.512293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.512576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.512850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.513126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.513392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.513754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.830 [2024-06-10 11:57:15.513771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.513782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.513794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.516264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.516540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.516809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.517072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.517345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.517616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.517884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.518152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.518413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.518726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.830 [2024-06-10 11:57:15.518741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.518752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.518763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.521276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.521545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.521810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.522081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.522384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.522655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.522918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.523178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.523440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.830 [2024-06-10 11:57:15.523707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.830 [2024-06-10 11:57:15.523721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:31.833 [2024-06-10 11:57:15.639031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.639042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.639052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.833 [2024-06-10 11:57:15.640562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.640592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.640918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.834 [2024-06-10 11:57:15.640933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.640944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.640954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.642858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.643046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.834 [2024-06-10 11:57:15.643059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.643070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.643080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.834 [2024-06-10 11:57:15.644980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.644991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.645002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.646884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.647065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.834 [2024-06-10 11:57:15.647077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.647087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.647097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.648698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.649068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.834 [2024-06-10 11:57:15.649084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.649095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.649111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.650989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.651015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.651044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.651218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.834 [2024-06-10 11:57:15.651229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.651240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.651250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.834 [2024-06-10 11:57:15.652750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.652780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.652808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.653139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.653154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.653165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.653175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.654834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.654865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.654898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.654933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.655450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.655471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.656971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.657000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.657026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.657307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.657321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.657330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.657341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.659791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.659810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.660956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.660993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.661615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.661635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.663982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.664176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.664188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.664198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.664207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.835 [2024-06-10 11:57:15.665882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.835 [2024-06-10 11:57:15.665895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:31.838 [previous message repeated with timestamps 11:57:15.665905 through 11:57:15.730365; identical "Failed to get src_mbufs!" error lines omitted]
00:46:31.838 [2024-06-10 11:57:15.730379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.730390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.730401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.732365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.732637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.732908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.733171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.733496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.733776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.734043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.734308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.734574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.734921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.838 [2024-06-10 11:57:15.734936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.734948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.734959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.736903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.737171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.737433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.737692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.838 [2024-06-10 11:57:15.737999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.738275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.738538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.738802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.739070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.739413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.839 [2024-06-10 11:57:15.739427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.739438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.739451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.741450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.741713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.741984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.742251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.742627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.742904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.743164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.743426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.743719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.743998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.839 [2024-06-10 11:57:15.744013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.744025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.744037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.746164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.746433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.746694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.746956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.747279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.747552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.747819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.748088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.748350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.748658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.839 [2024-06-10 11:57:15.748671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.748685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.748696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.750588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.750869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.751134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.751401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.751690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.751986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.752246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.752504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.752763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.753066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.839 [2024-06-10 11:57:15.753080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.753091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.753101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.755076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.755347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.755609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.755875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.756189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.756460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.756723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.756993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.757257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.757598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.839 [2024-06-10 11:57:15.757613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.757624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.757636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.759655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.759919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.760178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.760446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.760744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.761026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.761288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.761545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.761803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.762138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.839 [2024-06-10 11:57:15.762151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.762162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.762172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.764110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.764532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.765299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.765910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.766166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.766467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.766733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.767003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.839 [2024-06-10 11:57:15.767271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.767548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:31.840 [2024-06-10 11:57:15.767562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.767573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.767584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.769583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.769859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.770146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.770414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:31.840 [2024-06-10 11:57:15.770721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.771004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.771274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.771559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.771830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.772188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.104 [2024-06-10 11:57:15.772202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.772214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.772226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.774166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.774438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.774705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.774978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.775309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.775593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.775880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.776151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.776430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.776763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.104 [2024-06-10 11:57:15.776777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.776788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.776799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.778268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.779088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.780057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.781024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.781212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.781495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.781764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.782054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.782312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.782504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.104 [2024-06-10 11:57:15.782518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.782527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.782541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.784632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.785605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.786619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.787726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.788036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.788322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.788591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.788857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.789406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.104 [2024-06-10 11:57:15.789661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.105 [2024-06-10 11:57:15.789675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.789685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.789696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.791556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.792531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.793491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.794069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.794419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.794691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.794952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.795210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.796188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.796373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.105 [2024-06-10 11:57:15.796385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.796394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.796404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.798593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.799635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.800593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.800853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.801191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.801466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.801726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.802345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.803137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.105 [2024-06-10 11:57:15.803324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.108 [2024-06-10 11:57:15.897955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.897965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.897975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.899697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.900053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.108 [2024-06-10 11:57:15.900067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.900077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.900089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.108 [2024-06-10 11:57:15.901857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.901881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.108 [2024-06-10 11:57:15.903858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.903883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.108 [2024-06-10 11:57:15.905926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.905950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.906985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.108 [2024-06-10 11:57:15.907884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.907907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.108 [2024-06-10 11:57:15.909778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.909805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.910039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.109 [2024-06-10 11:57:15.910052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.910062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.910073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.109 [2024-06-10 11:57:15.911979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.911990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.912002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.913923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.914109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.109 [2024-06-10 11:57:15.914122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.914132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.914142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.915755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.916121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.109 [2024-06-10 11:57:15.916137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.916147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.916159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.917713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.917749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.917777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.917804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.109 [2024-06-10 11:57:15.918321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.918344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.919873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.920159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.109 [2024-06-10 11:57:15.920172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.920184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.920195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.921847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.921883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.921913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.921947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.922134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.922175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.922203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.109 [2024-06-10 11:57:15.922237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.922274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.922459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.110 [2024-06-10 11:57:15.922471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.922482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.922494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.923625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.923658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.923689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.923718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.924066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.924109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.924140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.924168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.924198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.110 [2024-06-10 11:57:15.924523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.110 [2024-06-10 11:57:15.924537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:32.113 [identical "Failed to get src_mbufs!" error repeated from 11:57:15.924548 through 11:57:15.995917; duplicate log lines elided]
00:46:32.113 [2024-06-10 11:57:15.995932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.995943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.995955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.997929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.998206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.998478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.999560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:15.999843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.000843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.001123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.001394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.001660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.002013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.113 [2024-06-10 11:57:16.002028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.002040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.002052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.003997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.004274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.004544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.004815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.005126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.005406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.005674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.005947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.006219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.006489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.113 [2024-06-10 11:57:16.006502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.006513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.006525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.008731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.009022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.009303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.009565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.009907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.010184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.010463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.010728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.011004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.011331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.113 [2024-06-10 11:57:16.011345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.011355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.011367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.013726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.014752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.015320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.016141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.016328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.017280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.018099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.018363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.018626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.018913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.113 [2024-06-10 11:57:16.018927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.018938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.018949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.021178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.021638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.022550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.023565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.023754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.024715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.024987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.025251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.025512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.025846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.113 [2024-06-10 11:57:16.025860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.025876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.025887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.027941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.028659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.113 [2024-06-10 11:57:16.029432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.030409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.030601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.031332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.031604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.031875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.032157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.032492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.114 [2024-06-10 11:57:16.032506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.032517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.032529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.034041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.034838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.035769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.036751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.036944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.037229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.037499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.037767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.038042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.038276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.114 [2024-06-10 11:57:16.038290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.038301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.038312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.040170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.040994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.041979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.042962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.043250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.043536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.043808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.044085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.044500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.044690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.114 [2024-06-10 11:57:16.044704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.044715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.114 [2024-06-10 11:57:16.044726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.046688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.047656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.048586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.049159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.049493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.049769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.050055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.050324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.051362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.051580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.378 [2024-06-10 11:57:16.051596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.051607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.051618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.053614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.054619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.055668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.055941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.056274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.056549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.056812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.057338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.058135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.058323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.378 [2024-06-10 11:57:16.058335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.058345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.058354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.060389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.061338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.061800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.062076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.062414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.062693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.062965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.063999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.065005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.065195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.378 [2024-06-10 11:57:16.065208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.065217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.065227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.067337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.068277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.068549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.068819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.069103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.069380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.070050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.070828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.071766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.071958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.378 [2024-06-10 11:57:16.071971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.071980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.071990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.074051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.074361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.378 [2024-06-10 11:57:16.074622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.074909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.075273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.075649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.076472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.077408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.078331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.379 [2024-06-10 11:57:16.078515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.379 [2024-06-10 11:57:16.078527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.382 [... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated through 2024-06-10 11:57:16.161804 ...] 
00:46:32.382 [2024-06-10 11:57:16.161817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.161828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.161838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.382 [2024-06-10 11:57:16.163958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.163980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.382 [2024-06-10 11:57:16.165922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.165942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.167983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.168012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.168043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.168293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.382 [2024-06-10 11:57:16.168307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.168317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.168327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.169829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.170017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.382 [2024-06-10 11:57:16.170030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.170040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.170050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.171985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.172014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.172332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.382 [2024-06-10 11:57:16.172345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.172355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.172365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.173403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.173442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.173469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.173497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.173702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.382 [2024-06-10 11:57:16.173744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.173771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.173799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.173829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.174035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.174048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.174059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.174069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.175987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.176015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.176043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.176371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.176385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.176396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.176406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.177983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.178010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.178039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.178255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.178269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.178278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.178288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.179623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.179656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.179684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.179712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.180482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.180503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.181623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.181655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.181683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.181710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.181975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.182341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.182362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.183610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.183642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.183670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.183698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.184551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.184572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.185746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.185783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.185813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.185840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.383 [2024-06-10 11:57:16.186374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.186393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.187577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.187609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.187651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.383 [2024-06-10 11:57:16.187691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.384 [2024-06-10 11:57:16.188072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.384 [2024-06-10 11:57:16.188121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.384 [2024-06-10 11:57:16.188153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.384 [2024-06-10 11:57:16.188181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.384 [2024-06-10 11:57:16.188210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.384 [2024-06-10 11:57:16.188468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.384 [2024-06-10 11:57:16.188482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:46:32.387 [2024-06-10 11:57:16.292723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.292733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.292743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.295139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.296073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.297000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.297416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.297855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.298865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.299819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.300317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.300582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.300914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.387 [2024-06-10 11:57:16.300928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.300940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.300951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.303513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.304571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.305144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.305931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.306114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.307068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.307911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.308176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.308438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.308731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.387 [2024-06-10 11:57:16.308744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.308755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.308770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.311058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.311539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.312479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.313518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.313707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.314699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.314979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.315247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.315518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.315864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.387 [2024-06-10 11:57:16.315884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.315895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.315907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.317928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.387 [2024-06-10 11:57:16.318820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.319641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.320612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.320802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.321354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.321627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.321904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.322168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.322510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.650 [2024-06-10 11:57:16.322527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.322538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.322549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.324157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.324954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.325908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.326857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.327087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.327366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.327628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.327895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.328157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.328347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.650 [2024-06-10 11:57:16.328360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.328370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.328381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.330416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.331409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.332340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.333243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.333494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.333769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.334035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.334298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.334963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.335187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.650 [2024-06-10 11:57:16.335201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.335211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.335222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.337191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.338151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.339115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.339457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.339815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.340096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.650 [2024-06-10 11:57:16.340361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.340681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.341546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.341735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.341747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.341757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.341767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.343872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.344851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.345510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.345786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.346141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.346423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.346692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.347609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.348384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.348570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.348584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.348596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.348608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.350884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.351913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.352186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.352463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.352730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.353030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.353674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.354457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.355390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.355574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.355589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.355599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.355615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.357734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.358050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.358334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.358602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.358974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.359344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.360172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.361147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.362123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.362318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.362332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.362343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.362354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.364027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.364306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.364572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.364846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.365183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.366286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.367286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.368354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.369422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.369688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.369701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.369711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.369721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.371101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.371366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.371627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.371890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.372080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.372885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.373827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.374772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.375205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.375391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.375404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.375415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.375426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.376957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.377224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.377485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.378500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.378730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.379705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.380638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.381050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.381889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.651 [2024-06-10 11:57:16.382076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.651 [2024-06-10 11:57:16.382088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.654 [2024-06-10 11:57:16.451729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.451740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.451753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.654 [2024-06-10 11:57:16.454626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.454654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.454683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.454903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.655 [2024-06-10 11:57:16.454918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.454933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.454944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.458984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.655 [2024-06-10 11:57:16.458998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.459010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.459022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.461902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.461945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.461981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.655 [2024-06-10 11:57:16.462512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.462531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.655 [2024-06-10 11:57:16.465797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.465816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.468572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.468609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.468637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.468667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.655 [2024-06-10 11:57:16.469487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.469508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.471693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.471730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.471759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.471787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.655 [2024-06-10 11:57:16.472577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.472597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.474840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.474884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.474913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.474940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.475279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.655 [2024-06-10 11:57:16.475336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.475367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.475396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.475427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.475724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.475738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.475749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.475760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.477980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.478782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.478809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.481937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.481960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.484980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.484990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.485001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.487757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.488067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.488082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.488092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.488103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.490884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.491152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.491166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.491176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.491186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.493403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.493442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.493701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.493732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.494024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.494067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.494096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.494131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.494160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.656 [2024-06-10 11:57:16.494506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:46:32.656 [2024-06-10 11:57:16.494522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:46:32.657 [... previous message repeated 54 more times, 11:57:16.494533 through 11:57:16.508795 ...]
00:46:32.919 [2024-06-10 11:57:16.615226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.921 [... previous message repeated 217 more times, 11:57:16.615323 through 11:57:16.714673 ...]
00:46:32.922 [2024-06-10 11:57:16.714709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.714981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.715018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.715301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.715319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.715335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.717281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.717329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.717614] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.717676] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.718002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.718280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.718321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.922 [2024-06-10 11:57:16.718585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.718621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.718967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.718987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.719005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.721129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.721176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.721440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.721492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.721843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.722141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.722179] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.722445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.922 [2024-06-10 11:57:16.722483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.722806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.722825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.722839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.724754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.724803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.725073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.725113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.725419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.725700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.725746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.726020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.726059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.922 [2024-06-10 11:57:16.726393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.726413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.726432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.728411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.728454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.728716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.728760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.729130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.729406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.729453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.729726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.729771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.730090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.922 [2024-06-10 11:57:16.730109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.730124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.732008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.732053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.732318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.732355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.732692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.732974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.733013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.733281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.733329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.733723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.733740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.922 [2024-06-10 11:57:16.733754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.735756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.735806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.736078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.736120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.736466] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.736741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.736777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.737060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.737097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.737411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.737429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.737444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.922 [2024-06-10 11:57:16.739386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.739429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.739695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.739735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.739989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.740268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.740311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.740576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.740615] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.740998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.741020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.741035] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.922 [2024-06-10 11:57:16.742924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.742969] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.743246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.743284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.743537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.743838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.743886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.744165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.744204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.744545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.744565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.744583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.746605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.746650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.746918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.746954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.747271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.747549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.747595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.747861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.747908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.748257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.748275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.748292] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.750224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.750277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.750542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.750579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.750887] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.751809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.753909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.753955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.754224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.754265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.754651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.754936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.754973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.755238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.755274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.755540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.755566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.755582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.757515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.757559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.758413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.758452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.758823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.759108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.759149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.759416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.759454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.759753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.759771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.759786] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.761445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.761489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.761528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.761564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.761910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.762187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.762227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.762263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.762298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.762582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.762600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.762613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.763775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.763810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.763842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.763878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.764151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.764199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:32.923 [2024-06-10 11:57:16.764240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:32.923 [2024-06-10 11:57:16.764273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:46:33.189 [2024-06-10 11:57:16.960766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.960805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.961102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.961120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.961134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.963781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.963845] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.964481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.964518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.964745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.965030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.965075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.965341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.189 [2024-06-10 11:57:16.965388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.965781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.965799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.965821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.968550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.189 [2024-06-10 11:57:16.968823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.968872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.969137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.969494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.969769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.969805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.970083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.970362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.970728] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.970749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.970764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.972788] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.972830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.973109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.973380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.973742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.973791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.974073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.974363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.974411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.974786] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.974805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.974823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.977251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.977531] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.977806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.977846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.978127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.978414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.978695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.978737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.979017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.979394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.979415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.979434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.981448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.981725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.981767] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.982058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.982319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.982604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.982646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.982924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.983198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.983555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.983572] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.983596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.986263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.986315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.986587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.986862] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.987223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.987273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.987543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.987823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.987875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.988231] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.988250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.988270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.990186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.990468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.990744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.990785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.991148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.991431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.991705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.991744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.992029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.992353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.992371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.992385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.994904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.995182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.995221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.995496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.995763] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.996059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.996103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.996386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.996667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.997055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.997076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.997096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.999156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.999197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.999470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.190 [2024-06-10 11:57:16.999748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:16.999998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:17.000053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:17.000328] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:17.000605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:17.000652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.190 [2024-06-10 11:57:17.001057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.001079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.001098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.003327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.003610] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.003904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.003953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.191 [2024-06-10 11:57:17.004280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.004561] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.004836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.004882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.005156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.005496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.005521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.005538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.007625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.007915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.007958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.008230] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.008569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.191 [2024-06-10 11:57:17.008850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.008898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.009169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.009448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.009732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.009749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.009764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.012242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.012295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.012569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.012846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.013130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.013185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.191 [2024-06-10 11:57:17.013459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.013746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.013787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.014108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.014127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.014151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.015882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.016161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.016435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.016475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.016831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.017127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.017402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.191 [2024-06-10 11:57:17.017442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.017706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.018051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.018069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.018084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.020411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.020456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.020721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.020759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.021031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.021520] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.021559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.191 [2024-06-10 11:57:17.022097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.191 [2024-06-10 11:57:17.022148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.457 [previous message repeated for subsequent allocation attempts through 2024-06-10 11:57:17.132147] 
00:46:33.457 [2024-06-10 11:57:17.132196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.132955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.132995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.133031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.133220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.133236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.133250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.136048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.136587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.136623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.136655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.136932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.137212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.457 [2024-06-10 11:57:17.137253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.137286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.138319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.138510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.138526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.138539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.140741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.140788] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.140819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.141782] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.141992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.142044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.142080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.457 [2024-06-10 11:57:17.142355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.142394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.142735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.142753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.142768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.145256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.145312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.146108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.146145] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.146371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.146420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.147360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.147397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.457 [2024-06-10 11:57:17.147428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.147617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.147633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.147646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.149115] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.149799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.149836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.149872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.150213] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.150978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.151015] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.151049] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.457 [2024-06-10 11:57:17.151957] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.152152] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.152168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.152182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.155550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.155595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.155629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.155898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.156939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.156953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.159712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.159763] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.160656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.160693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.160988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.161038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.161889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.161925] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.161957] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.162277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.162300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.162316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.165228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.166177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.166216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.166248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.166523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.167451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.167491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.167522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.168457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.168646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.168663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.168676] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.172010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.172054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.172087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.173092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.173288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.173337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.173379] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.174072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.174109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.174336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.174352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.174366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.177947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.179013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.179065] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.179329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.179633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.179682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.180467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.180506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.181440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.181631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.181647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.181661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.184311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.184574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.184612] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.184861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.185238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.185286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.185864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.185905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.186681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.186880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.186897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.186910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.189868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.190250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.190289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.458 [2024-06-10 11:57:17.190966] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.191303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.191353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.191971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.192008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.192426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.192762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.192784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.192801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.195885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.196913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.196958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.458 [2024-06-10 11:57:17.197899] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.459 [2024-06-10 11:57:17.198092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.198144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.198412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.198451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.198717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.199069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.199089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.199106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.201464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.202254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.202293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.203240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.203434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.459 [2024-06-10 11:57:17.203484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.204135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.204180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.205212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.205585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.205606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.205624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.207495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.208431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.208469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.209216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.209427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.459 [2024-06-10 11:57:17.209472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.459 [2024-06-10 11:57:17.210240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.462 [2024-06-10 11:57:17.341054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.462 [2024-06-10 11:57:17.341094] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.341308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.342109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.342710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.342751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.343779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.344053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.344072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.344086] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.348616] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.348912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.348954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.349939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.462 [2024-06-10 11:57:17.350326] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.351391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.351444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.351714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.352203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.352410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.352426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.352443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.355675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.355726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.356723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.357665] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.357859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.462 [2024-06-10 11:57:17.357916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.358586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.359116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.359157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.359503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.359522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.359541] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.363077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.364114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.364683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.364723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.364968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.365963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.462 [2024-06-10 11:57:17.366954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.367002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.367486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.367678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.367695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.367709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.370522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.370575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.371546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.371584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.371784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.372494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.372532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.462 [2024-06-10 11:57:17.373321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.462 [2024-06-10 11:57:17.373358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.373546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.373562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.373577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.376318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.376364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.377013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.377051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.377278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.378262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.378301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.379245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.463 [2024-06-10 11:57:17.379283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.379558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.379579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.379594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.382960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.383008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.383641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.383679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.383954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.384237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.384276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.385264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.385309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.463 [2024-06-10 11:57:17.385495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.385511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.385528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.389414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.389460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.390221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.390260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.390595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.390884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.390921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.391973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.392020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.392393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.463 [2024-06-10 11:57:17.392414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.392432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.395801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.395847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.396811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.396849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.397046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.397695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.397741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.398840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.398908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.399281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.463 [2024-06-10 11:57:17.399301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.463 [2024-06-10 11:57:17.399321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.402578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.402625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.403371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.403410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.403599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.404408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.404447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.405385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.405430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.405615] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.405631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.405645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.726 [2024-06-10 11:57:17.408889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.408935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.409714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.409751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.409946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.410921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.410960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.411382] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.411421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.411646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.411664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.411679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.415151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.726 [2024-06-10 11:57:17.415195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.415606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.415646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.416012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.416991] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.417031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.417984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.726 [2024-06-10 11:57:17.418026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.418215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.418231] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.418246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.421960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.422005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.727 [2024-06-10 11:57:17.422322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.422361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.422710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.423686] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.423730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.424003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.424041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.424328] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.424343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.424356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428021] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.727 [2024-06-10 11:57:17.428131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.428939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.429141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.429156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.429171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.430928] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.430967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.430999] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.727 [2024-06-10 11:57:17.431030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.727 [2024-06-10 11:57:17.431215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:46:33.730 [... previous *ERROR* line repeated continuously from 11:57:17.431215 through 11:57:17.546677 ...]
00:46:33.730 [2024-06-10 11:57:17.547394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.547433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.547881] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.548071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.548123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.549079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.549120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.550193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.550480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.550503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.550526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.552769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.553560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.730 [2024-06-10 11:57:17.553597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.554462] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.554687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.554746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.555690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.555743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.556713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.556905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.556920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.556934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.558770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.559327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.559364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.730 [2024-06-10 11:57:17.559850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.560083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.560129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.560553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.560592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.560862] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.561199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.561214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.561231] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.562603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.730 [2024-06-10 11:57:17.563548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.563584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.564159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.731 [2024-06-10 11:57:17.564386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.564434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.565373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.565409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.566342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.566578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.566594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.566608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.569237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.570273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.570308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.571295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.571484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.731 [2024-06-10 11:57:17.571532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.572110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.572146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.572924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.573113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.573128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.573141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.574689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.574968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.575021] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.575290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.575571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.575630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.731 [2024-06-10 11:57:17.575907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.575944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.576745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.577083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.577099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.577114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.578954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.579906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.579944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.580244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.580577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.580628] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.580901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.731 [2024-06-10 11:57:17.580946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.581216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.581569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.581586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.581601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.583119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.583394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.583432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.583697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.583888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.583931] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.584198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.584235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.731 [2024-06-10 11:57:17.584500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.584795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.584819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.584833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.586659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.586945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.587222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.587487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.587815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.587863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.588755] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.589044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.589315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.731 [2024-06-10 11:57:17.589635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.589651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.731 [2024-06-10 11:57:17.589666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.591482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.591758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.592032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.592302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.592534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.593210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.593477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.593742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.594015] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.594307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.732 [2024-06-10 11:57:17.594323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.594338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.596332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.596626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.596958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.597838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.598206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.598489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.598776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.599068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.599349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.599547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.599562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.732 [2024-06-10 11:57:17.599575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.601314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.601605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.602456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.602808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.603155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.603436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.603712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.603992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.604496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.604698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.604716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.604732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.732 [2024-06-10 11:57:17.606439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.606814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.607614] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.607650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.607975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.608250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.608523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.608793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.608831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.609168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.609187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.609202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.610730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.732 [2024-06-10 11:57:17.610771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.611045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.611088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.611374] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.611652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.611691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.611961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.612007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.612337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.612352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.612366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.614066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.614114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.732 [2024-06-10 11:57:17.614380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.614418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.614669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.614955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.615005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.615271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.615307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.615611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.615626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.615639] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.617312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.617350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.732 [2024-06-10 11:57:17.617619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.732 [2024-06-10 11:57:17.617657] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.617902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.618894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.620518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.620558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.620823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.620859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.621122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.621398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.621436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.621705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.621745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.621997] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.622013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.622028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.623753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.623797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.624070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.624111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.624437] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.624721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.624765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.625038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.625080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.625422] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.625441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.625458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.627039] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.627088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.627353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.627397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.627666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.627948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.627989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.628254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.628292] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.628516] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.628531] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.628545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.630132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.631048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.631088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.631359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.631548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.631832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.631875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.632141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.632409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.632673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.632688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.632701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.634193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.634239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.634507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.634788] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.635118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.635171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.636172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.636451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.636493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.636680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.636698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.636712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.638292] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.639255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.640300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.640338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.640528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.641595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.642226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.642263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.643051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.643241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.643256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.643269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.644861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.645138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.645182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.645446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.645720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.646004] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.646041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.646508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.647193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.647442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.733 [2024-06-10 11:57:17.647457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.647470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.733 [2024-06-10 11:57:17.649262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.649301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.650217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.651153] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.651345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.651393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.652085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.653077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.653123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.653317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.653332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.734 [2024-06-10 11:57:17.653346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.654539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.654811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.655297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.655333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.655552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.656510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.657452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.657489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.658149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.658340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.658354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.658368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.734 [2024-06-10 11:57:17.659525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.659795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.659832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.660102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.660287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.661167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.661204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.662258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.663269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.663563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.663578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.663595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.664885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.734 [2024-06-10 11:57:17.664930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.665204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.665479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.665700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.665744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.666547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.667516] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.667552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.667772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.667787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.734 [2024-06-10 11:57:17.667800] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.668935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.669894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.995 [2024-06-10 11:57:17.670221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.670258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.670585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.670875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.671548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.671583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.672372] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.672558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.672573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.672586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.674559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.675499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.675535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.995 [2024-06-10 11:57:17.676139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.676452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.676726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.676771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.677240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.678049] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.678240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.678257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.678271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.680134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.680185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.681207] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.682210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.995 [2024-06-10 11:57:17.682488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.682534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.682808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.683084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.683121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.683309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.683324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.683338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.684291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.685250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.686088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.686124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.686310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.995 [2024-06-10 11:57:17.687294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.687637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.687674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.687949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.688291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.688309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.688327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.690237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.690284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.691084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.691118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.691342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.692305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.995 [2024-06-10 11:57:17.692341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.693275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.693310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.693593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.995 [2024-06-10 11:57:17.693608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.693622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.695769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.695807] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.696886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.696928] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.697113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.697667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:46:33.996 [2024-06-10 11:57:17.697702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:46:33.996 [2024-06-10 11:57:17.698489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:46:33.996 [... ~130 further identical "Failed to get dst_mbufs!" errors between 11:57:17.698524 and 11:57:17.740339 trimmed ...]
00:46:34.256
00:46:34.256 Latency(us)
00:46:34.256 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:46:34.256 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x0 length 0x100
00:46:34.256 crypto_ram : 5.54 58.48 3.65 0.00 0.00 2076970.22 55848.07 1728782.69
00:46:34.256 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x100 length 0x100
00:46:34.256 crypto_ram : 5.68 62.98 3.94 0.00 0.00 1951389.24 101666.28 1633954.95
00:46:34.256 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x0 length 0x100
00:46:34.256 crypto_ram1 : 5.58 63.59 3.97 0.00 0.00 1903647.67 58355.53 1604777.18
00:46:34.256 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x100 length 0x100
00:46:34.256 crypto_ram1 : 5.71 67.10 4.19 0.00 0.00 1833372.46 88445.11 1509949.44
00:46:34.256 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x0 length 0x100
00:46:34.256 crypto_ram2 : 5.36 419.24 26.20 0.00 0.00 284065.36 24162.84 428548.45
00:46:34.256 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x100 length 0x100
00:46:34.256 crypto_ram2 : 5.39 426.88 26.68 0.00 0.00 278795.53 50833.14 417606.79
00:46:34.256 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x0 length 0x100
00:46:34.256 crypto_ram3 : 5.44 431.81 26.99 0.00 0.00 270350.74 10029.86 306366.55
00:46:34.256 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:46:34.256 Verification LBA range: start 0x100 length 0x100
00:46:34.256 crypto_ram3 : 5.47 437.27 27.33 0.00 0.00 266846.93 27012.23 317308.22
00:46:34.256 ===================================================================================================================
00:46:34.256 Total : 1967.34 122.96 0.00 0.00 495143.70 10029.86 1728782.69
00:46:34.825
00:46:34.825 real 0m8.706s
00:46:34.825 user 0m16.591s
00:46:34.825 sys 0m0.432s
00:46:34.825 11:57:18 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable
00:46:34.825 11:57:18 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:46:34.825 ************************************
00:46:34.825 END TEST bdev_verify_big_io
00:46:34.825 ************************************
00:46:34.825 11:57:18 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:46:34.825 11:57:18 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:46:34.825 11:57:18 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:46:34.825 11:57:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:46:34.825 ************************************
00:46:34.825 START TEST bdev_write_zeroes
00:46:34.825 ************************************
00:46:34.825 11:57:18 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:46:34.825 [2024-06-10 11:57:18.657748] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization...
00:46:34.825 [2024-06-10 11:57:18.657789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid302108 ]
00:46:34.825 [2024-06-10 11:57:18.741597] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:46:35.084 [2024-06-10 11:57:18.825987] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:46:35.084 [2024-06-10 11:57:18.846862] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:46:35.084 [2024-06-10 11:57:18.854898] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:46:35.084 [2024-06-10 11:57:18.862916] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:46:35.084 [2024-06-10 11:57:18.965682] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:46:37.621 [2024-06-10 11:57:21.150356] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:46:37.621 [2024-06-10 11:57:21.150433] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:46:37.621 [2024-06-10 11:57:21.150444] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:46:37.621 [2024-06-10 11:57:21.158377] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:46:37.621 [2024-06-10 11:57:21.158391] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:46:37.621 [2024-06-10 11:57:21.158399] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:46:37.621 [2024-06-10 11:57:21.166395] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:46:37.621 [2024-06-10 11:57:21.166406] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:46:37.621 [2024-06-10 11:57:21.166413] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:46:37.621 [2024-06-10 11:57:21.174415] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:46:37.621 [2024-06-10 11:57:21.174425] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:46:37.621 [2024-06-10 11:57:21.174432] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:46:37.621 Running I/O for 1 seconds...
00:46:38.558
00:46:38.559 Latency(us)
00:46:38.559 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:46:38.559 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:46:38.559 crypto_ram : 1.02 3142.25 12.27 0.00 0.00 40546.45 3704.21 50149.29
00:46:38.559 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:46:38.559 crypto_ram1 : 1.02 3147.76 12.30 0.00 0.00 40319.98 3704.21 46502.07
00:46:38.559 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:46:38.559 crypto_ram2 : 1.01 24527.27 95.81 0.00 0.00 5166.07 1617.03 6810.05
00:46:38.559 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:46:38.559 crypto_ram3 : 1.01 24560.04 95.94 0.00 0.00 5148.12 1588.54 5841.25
00:46:38.559 ===================================================================================================================
00:46:38.559 Total : 55377.32 216.32 0.00 0.00 9174.97 1588.54 50149.29
00:46:38.817
00:46:38.817 real 0m4.048s
00:46:38.817 user 0m3.668s
00:46:38.818 sys 0m0.343s
00:46:38.818 11:57:22 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable
00:46:38.818 11:57:22
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:46:38.818 ************************************ 00:46:38.818 END TEST bdev_write_zeroes 00:46:38.818 ************************************ 00:46:38.818 11:57:22 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:46:38.818 11:57:22 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:46:38.818 11:57:22 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:46:38.818 11:57:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:46:38.818 ************************************ 00:46:38.818 START TEST bdev_json_nonenclosed 00:46:38.818 ************************************ 00:46:38.818 11:57:22 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:46:39.077 [2024-06-10 11:57:22.783043] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:46:39.077 [2024-06-10 11:57:22.783086] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid302648 ] 00:46:39.077 [2024-06-10 11:57:22.868672] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:39.077 [2024-06-10 11:57:22.949749] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:39.077 [2024-06-10 11:57:22.949824] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:46:39.077 [2024-06-10 11:57:22.949838] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:46:39.077 [2024-06-10 11:57:22.949847] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:46:39.337 00:46:39.337 real 0m0.303s 00:46:39.337 user 0m0.184s 00:46:39.337 sys 0m0.117s 00:46:39.337 11:57:23 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:39.337 11:57:23 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:46:39.337 ************************************ 00:46:39.337 END TEST bdev_json_nonenclosed 00:46:39.337 ************************************ 00:46:39.337 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:46:39.337 11:57:23 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:46:39.337 11:57:23 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:46:39.337 11:57:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:46:39.337 ************************************ 00:46:39.337 START TEST bdev_json_nonarray 00:46:39.337 ************************************ 00:46:39.337 11:57:23 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:46:39.337 [2024-06-10 11:57:23.150531] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:46:39.337 [2024-06-10 11:57:23.150572] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid302669 ] 00:46:39.337 [2024-06-10 11:57:23.233424] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:39.597 [2024-06-10 11:57:23.321747] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:39.597 [2024-06-10 11:57:23.321806] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:46:39.597 [2024-06-10 11:57:23.321820] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:46:39.597 [2024-06-10 11:57:23.321829] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:46:39.597 00:46:39.597 real 0m0.314s 00:46:39.597 user 0m0.199s 00:46:39.597 sys 0m0.112s 00:46:39.597 11:57:23 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:39.597 11:57:23 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:46:39.597 ************************************ 00:46:39.597 END TEST bdev_json_nonarray 00:46:39.597 ************************************ 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:46:39.597 11:57:23 blockdev_crypto_qat -- 
bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:46:39.597 11:57:23 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:46:39.597 00:46:39.597 real 1m8.279s 00:46:39.597 user 2m34.096s 00:46:39.597 sys 0m7.785s 00:46:39.597 11:57:23 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:46:39.597 11:57:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:46:39.597 ************************************ 00:46:39.597 END TEST blockdev_crypto_qat 00:46:39.597 ************************************ 00:46:39.597 11:57:23 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:46:39.597 11:57:23 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:46:39.597 11:57:23 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:46:39.597 11:57:23 -- common/autotest_common.sh@10 -- # set +x 00:46:39.597 ************************************ 00:46:39.597 START TEST chaining 00:46:39.597 ************************************ 00:46:39.597 11:57:23 chaining -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:46:39.857 * Looking for test storage... 
00:46:39.857 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@7 -- # uname -s 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:804da62e-425e-e711-906e-0017a4403562 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=804da62e-425e-e711-906e-0017a4403562 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:46:39.857 11:57:23 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:46:39.857 11:57:23 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:46:39.857 11:57:23 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:46:39.857 11:57:23 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:46:39.857 11:57:23 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:46:39.857 11:57:23 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:46:39.857 11:57:23 chaining -- paths/export.sh@5 -- # export PATH 00:46:39.857 11:57:23 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@47 -- # : 0 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:46:39.857 11:57:23 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:46:39.857 11:57:23 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:46:39.857 11:57:23 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:46:39.857 11:57:23 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:46:39.857 11:57:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@296 -- # e810=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@297 -- # x722=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@298 -- # mlx=() 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:46:47.979 11:57:31 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@336 -- # return 1 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:46:47.980 WARNING: No supported devices were found, fallback requested for tcp test 00:46:47.980 11:57:31 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:46:47.980 Cannot find device "nvmf_tgt_br" 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@155 -- # true 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:46:47.980 Cannot find device "nvmf_tgt_br2" 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@156 -- # true 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:46:47.980 Cannot find device "nvmf_tgt_br" 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@158 -- # 
true 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:46:47.980 Cannot find device "nvmf_tgt_br2" 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@159 -- # true 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:46:47.980 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@162 -- # true 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:46:47.980 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@163 -- # true 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:46:47.980 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:46:47.980 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.088 ms 00:46:47.980 00:46:47.980 --- 10.0.0.2 ping statistics --- 00:46:47.980 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:46:47.980 rtt min/avg/max/mdev = 0.088/0.088/0.088/0.000 ms 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:46:47.980 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:46:47.980 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.103 ms 00:46:47.980 00:46:47.980 --- 10.0.0.3 ping statistics --- 00:46:47.980 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:46:47.980 rtt min/avg/max/mdev = 0.103/0.103/0.103/0.000 ms 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:46:47.980 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:46:47.980 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.046 ms 00:46:47.980 00:46:47.980 --- 10.0.0.1 ping statistics --- 00:46:47.980 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:46:47.980 rtt min/avg/max/mdev = 0.046/0.046/0.046/0.000 ms 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@433 -- # return 0 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:46:47.980 11:57:31 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:46:47.980 11:57:31 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:46:47.980 11:57:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@481 -- # nvmfpid=306463 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@482 -- # waitforlisten 306463 00:46:47.980 11:57:31 chaining -- nvmf/common.sh@480 -- # ip netns exec 
nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:46:47.980 11:57:31 chaining -- common/autotest_common.sh@830 -- # '[' -z 306463 ']' 00:46:47.980 11:57:31 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:46:47.980 11:57:31 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:46:47.981 11:57:31 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:46:47.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:46:47.981 11:57:31 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:46:47.981 11:57:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:47.981 [2024-06-10 11:57:31.830478] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:46:47.981 [2024-06-10 11:57:31.830533] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:46:47.981 [2024-06-10 11:57:31.923371] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:48.239 [2024-06-10 11:57:32.015990] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:46:48.239 [2024-06-10 11:57:32.016032] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:46:48.239 [2024-06-10 11:57:32.016041] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:46:48.239 [2024-06-10 11:57:32.016049] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:46:48.239 [2024-06-10 11:57:32.016057] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:46:48.239 [2024-06-10 11:57:32.016084] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:46:48.806 11:57:32 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:46:48.806 11:57:32 chaining -- common/autotest_common.sh@863 -- # return 0 00:46:48.806 11:57:32 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:46:48.806 11:57:32 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:46:48.807 11:57:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:48.807 11:57:32 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:46:48.807 11:57:32 chaining -- bdev/chaining.sh@69 -- # mktemp 00:46:48.807 11:57:32 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.G1vP9tL19u 00:46:48.807 11:57:32 chaining -- bdev/chaining.sh@69 -- # mktemp 00:46:48.807 11:57:32 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.qhdnO2H9U1 00:46:48.807 11:57:32 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:46:48.807 11:57:32 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:46:48.807 11:57:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:48.807 11:57:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:48.807 malloc0 00:46:48.807 true 00:46:48.807 true 00:46:48.807 [2024-06-10 11:57:32.730118] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:46:48.807 crypto0 00:46:48.807 [2024-06-10 11:57:32.738143] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:46:48.807 crypto1 00:46:48.807 [2024-06-10 11:57:32.746234] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:46:49.066 [2024-06-10 11:57:32.762452] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:46:49.066 11:57:32 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.066 11:57:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.G1vP9tL19u bs=1K count=64 00:46:49.066 64+0 records in 00:46:49.066 64+0 records out 00:46:49.066 65536 bytes (66 kB, 64 KiB) copied, 0.00106057 s, 61.8 MB/s 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.G1vP9tL19u --ob Nvme0n1 --bs 65536 --count 1 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@25 -- # local config 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:46:49.066 11:57:32 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:46:49.066 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:46:49.066 11:57:33 chaining -- bdev/chaining.sh@31 -- # config='{ 00:46:49.066 "subsystems": [ 00:46:49.066 { 00:46:49.066 "subsystem": "bdev", 00:46:49.066 "config": [ 00:46:49.066 { 00:46:49.066 "method": "bdev_nvme_attach_controller", 00:46:49.066 "params": { 00:46:49.066 "trtype": "tcp", 00:46:49.066 "adrfam": "IPv4", 00:46:49.066 "name": "Nvme0", 00:46:49.066 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:49.066 "traddr": "10.0.0.2", 00:46:49.066 "trsvcid": "4420" 00:46:49.066 } 00:46:49.066 }, 00:46:49.066 { 00:46:49.066 "method": "bdev_set_options", 00:46:49.066 "params": { 00:46:49.066 "bdev_auto_examine": false 00:46:49.067 } 00:46:49.067 } 00:46:49.067 ] 00:46:49.067 } 00:46:49.067 ] 00:46:49.067 }' 00:46:49.067 11:57:33 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:46:49.067 "subsystems": [ 00:46:49.067 { 00:46:49.067 "subsystem": "bdev", 00:46:49.067 "config": [ 00:46:49.067 { 00:46:49.067 "method": "bdev_nvme_attach_controller", 00:46:49.067 "params": { 00:46:49.067 "trtype": "tcp", 00:46:49.067 "adrfam": "IPv4", 00:46:49.067 "name": "Nvme0", 00:46:49.067 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:49.067 "traddr": "10.0.0.2", 00:46:49.067 "trsvcid": 
"4420" 00:46:49.067 } 00:46:49.067 }, 00:46:49.067 { 00:46:49.067 "method": "bdev_set_options", 00:46:49.067 "params": { 00:46:49.067 "bdev_auto_examine": false 00:46:49.067 } 00:46:49.067 } 00:46:49.067 ] 00:46:49.067 } 00:46:49.067 ] 00:46:49.067 }' 00:46:49.067 11:57:33 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.G1vP9tL19u --ob Nvme0n1 --bs 65536 --count 1 00:46:49.325 [2024-06-10 11:57:33.053634] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:46:49.325 [2024-06-10 11:57:33.053678] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid306615 ] 00:46:49.325 [2024-06-10 11:57:33.140687] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:49.325 [2024-06-10 11:57:33.220469] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:49.843  Copying: 64/64 [kB] (average 12 MBps) 00:46:49.843 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.843 
11:57:33 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@96 -- # update_stats 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:49.843 11:57:33 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:49.843 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.103 11:57:33 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:50.103 11:57:33 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.103 11:57:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:50.103 11:57:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.qhdnO2H9U1 --ib Nvme0n1 --bs 65536 --count 1 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@25 -- # local config 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:46:50.104 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@31 -- # config='{ 00:46:50.104 "subsystems": [ 00:46:50.104 { 00:46:50.104 "subsystem": "bdev", 00:46:50.104 "config": [ 00:46:50.104 { 00:46:50.104 "method": "bdev_nvme_attach_controller", 00:46:50.104 "params": { 
00:46:50.104 "trtype": "tcp", 00:46:50.104 "adrfam": "IPv4", 00:46:50.104 "name": "Nvme0", 00:46:50.104 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:50.104 "traddr": "10.0.0.2", 00:46:50.104 "trsvcid": "4420" 00:46:50.104 } 00:46:50.104 }, 00:46:50.104 { 00:46:50.104 "method": "bdev_set_options", 00:46:50.104 "params": { 00:46:50.104 "bdev_auto_examine": false 00:46:50.104 } 00:46:50.104 } 00:46:50.104 ] 00:46:50.104 } 00:46:50.104 ] 00:46:50.104 }' 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.qhdnO2H9U1 --ib Nvme0n1 --bs 65536 --count 1 00:46:50.104 11:57:33 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:46:50.104 "subsystems": [ 00:46:50.104 { 00:46:50.104 "subsystem": "bdev", 00:46:50.104 "config": [ 00:46:50.104 { 00:46:50.104 "method": "bdev_nvme_attach_controller", 00:46:50.104 "params": { 00:46:50.104 "trtype": "tcp", 00:46:50.104 "adrfam": "IPv4", 00:46:50.104 "name": "Nvme0", 00:46:50.104 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:50.104 "traddr": "10.0.0.2", 00:46:50.104 "trsvcid": "4420" 00:46:50.104 } 00:46:50.104 }, 00:46:50.104 { 00:46:50.104 "method": "bdev_set_options", 00:46:50.104 "params": { 00:46:50.104 "bdev_auto_examine": false 00:46:50.104 } 00:46:50.104 } 00:46:50.104 ] 00:46:50.104 } 00:46:50.104 ] 00:46:50.104 }' 00:46:50.104 [2024-06-10 11:57:34.009433] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:46:50.104 [2024-06-10 11:57:34.009480] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid306777 ] 00:46:50.363 [2024-06-10 11:57:34.095470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:50.363 [2024-06-10 11:57:34.174292] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:50.622  Copying: 64/64 [kB] (average 9142 kBps) 00:46:50.622 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:50.622 11:57:34 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:50.622 11:57:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.622 11:57:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:50.881 
11:57:34 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:50.881 11:57:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.G1vP9tL19u /tmp/tmp.qhdnO2H9U1 00:46:50.881 11:57:34 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:46:50.882 11:57:34 chaining -- bdev/chaining.sh@25 -- # local config 00:46:50.882 11:57:34 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:46:50.882 11:57:34 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:46:50.882 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:46:50.882 11:57:34 chaining -- bdev/chaining.sh@31 -- # config='{ 00:46:50.882 "subsystems": [ 00:46:50.882 { 00:46:50.882 "subsystem": "bdev", 00:46:50.882 "config": [ 00:46:50.882 { 00:46:50.882 "method": "bdev_nvme_attach_controller", 00:46:50.882 "params": { 00:46:50.882 "trtype": "tcp", 00:46:50.882 "adrfam": "IPv4", 00:46:50.882 "name": "Nvme0", 00:46:50.882 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:50.882 "traddr": "10.0.0.2", 00:46:50.882 "trsvcid": "4420" 00:46:50.882 } 00:46:50.882 }, 00:46:50.882 { 00:46:50.882 "method": "bdev_set_options", 00:46:50.882 "params": { 00:46:50.882 "bdev_auto_examine": false 00:46:50.882 } 00:46:50.882 } 00:46:50.882 ] 00:46:50.882 } 00:46:50.882 ] 00:46:50.882 }' 00:46:50.882 11:57:34 chaining 
-- bdev/chaining.sh@33 -- # echo '{ 00:46:50.882 "subsystems": [ 00:46:50.882 { 00:46:50.882 "subsystem": "bdev", 00:46:50.882 "config": [ 00:46:50.882 { 00:46:50.882 "method": "bdev_nvme_attach_controller", 00:46:50.882 "params": { 00:46:50.882 "trtype": "tcp", 00:46:50.882 "adrfam": "IPv4", 00:46:50.882 "name": "Nvme0", 00:46:50.882 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:50.882 "traddr": "10.0.0.2", 00:46:50.882 "trsvcid": "4420" 00:46:50.882 } 00:46:50.882 }, 00:46:50.882 { 00:46:50.882 "method": "bdev_set_options", 00:46:50.882 "params": { 00:46:50.882 "bdev_auto_examine": false 00:46:50.882 } 00:46:50.882 } 00:46:50.882 ] 00:46:50.882 } 00:46:50.882 ] 00:46:50.882 }' 00:46:50.882 11:57:34 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:46:50.882 [2024-06-10 11:57:34.818840] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:46:50.882 [2024-06-10 11:57:34.818893] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid306971 ] 00:46:51.141 [2024-06-10 11:57:34.902714] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:51.141 [2024-06-10 11:57:34.982663] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:51.660  Copying: 64/64 [kB] (average 12 MBps) 00:46:51.660 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@106 -- # update_stats 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # 
rpc=rpc_cmd 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:51.660 11:57:35 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:51.660 11:57:35 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.G1vP9tL19u --ob Nvme0n1 --bs 4096 --count 16 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@25 -- # local config 00:46:51.660 11:57:35 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:46:51.660 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:46:51.660 11:57:35 chaining -- 
bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:46:51.920 11:57:35 chaining -- bdev/chaining.sh@31 -- # config='{ 00:46:51.920 "subsystems": [ 00:46:51.920 { 00:46:51.920 "subsystem": "bdev", 00:46:51.920 "config": [ 00:46:51.920 { 00:46:51.920 "method": "bdev_nvme_attach_controller", 00:46:51.920 "params": { 00:46:51.920 "trtype": "tcp", 00:46:51.920 "adrfam": "IPv4", 00:46:51.920 "name": "Nvme0", 00:46:51.920 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:51.920 "traddr": "10.0.0.2", 00:46:51.920 "trsvcid": "4420" 00:46:51.920 } 00:46:51.920 }, 00:46:51.920 { 00:46:51.920 "method": "bdev_set_options", 00:46:51.920 "params": { 00:46:51.920 "bdev_auto_examine": false 00:46:51.920 } 00:46:51.920 } 00:46:51.920 ] 00:46:51.920 } 00:46:51.920 ] 00:46:51.920 }' 00:46:51.920 11:57:35 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:46:51.920 "subsystems": [ 00:46:51.920 { 00:46:51.920 "subsystem": "bdev", 00:46:51.920 "config": [ 00:46:51.920 { 00:46:51.920 "method": "bdev_nvme_attach_controller", 00:46:51.920 "params": { 00:46:51.920 "trtype": "tcp", 00:46:51.920 "adrfam": "IPv4", 00:46:51.920 "name": "Nvme0", 00:46:51.920 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:51.920 "traddr": "10.0.0.2", 00:46:51.920 "trsvcid": "4420" 00:46:51.920 } 00:46:51.920 }, 00:46:51.920 { 00:46:51.920 "method": "bdev_set_options", 00:46:51.920 "params": { 00:46:51.920 "bdev_auto_examine": false 00:46:51.920 } 00:46:51.920 } 00:46:51.920 ] 00:46:51.920 } 00:46:51.920 ] 00:46:51.920 }' 00:46:51.920 11:57:35 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.G1vP9tL19u --ob Nvme0n1 --bs 4096 --count 16 00:46:51.920 [2024-06-10 11:57:35.676657] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:46:51.920 [2024-06-10 11:57:35.676708] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid307004 ] 00:46:51.920 [2024-06-10 11:57:35.763244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:51.920 [2024-06-10 11:57:35.842381] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:52.439  Copying: 64/64 [kB] (average 12 MBps) 00:46:52.439 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:52.439 
11:57:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:52.439 11:57:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.439 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@114 -- # update_stats 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.699 11:57:36 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:52.699 11:57:36 chaining -- 
bdev/chaining.sh@40 -- # [[ -z copy ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:52.699 11:57:36 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@117 -- # : 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.qhdnO2H9U1 --ib Nvme0n1 --bs 4096 --count 16 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@25 -- # local config 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:46:52.699 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@31 -- # config='{ 00:46:52.699 "subsystems": [ 00:46:52.699 { 00:46:52.699 "subsystem": "bdev", 00:46:52.699 "config": [ 00:46:52.699 { 00:46:52.699 "method": "bdev_nvme_attach_controller", 00:46:52.699 "params": { 00:46:52.699 "trtype": "tcp", 00:46:52.699 "adrfam": "IPv4", 00:46:52.699 "name": "Nvme0", 00:46:52.699 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:52.699 "traddr": "10.0.0.2", 00:46:52.699 "trsvcid": "4420" 00:46:52.699 } 00:46:52.699 }, 00:46:52.699 { 00:46:52.699 "method": "bdev_set_options", 00:46:52.699 "params": { 00:46:52.699 "bdev_auto_examine": false 00:46:52.699 } 00:46:52.699 } 00:46:52.699 ] 00:46:52.699 } 00:46:52.699 ] 00:46:52.699 }' 00:46:52.699 11:57:36 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.qhdnO2H9U1 --ib Nvme0n1 --bs 4096 --count 16 00:46:52.699 11:57:36 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:46:52.699 "subsystems": [ 00:46:52.699 { 00:46:52.699 "subsystem": "bdev", 00:46:52.699 "config": [ 00:46:52.699 { 00:46:52.699 "method": "bdev_nvme_attach_controller", 00:46:52.699 "params": { 00:46:52.699 "trtype": "tcp", 00:46:52.699 "adrfam": "IPv4", 00:46:52.699 "name": "Nvme0", 00:46:52.699 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:46:52.699 "traddr": "10.0.0.2", 00:46:52.699 "trsvcid": "4420" 00:46:52.699 } 00:46:52.699 }, 00:46:52.699 { 00:46:52.699 "method": "bdev_set_options", 00:46:52.699 "params": { 00:46:52.699 "bdev_auto_examine": false 00:46:52.699 } 00:46:52.699 } 00:46:52.699 ] 00:46:52.700 } 00:46:52.700 ] 00:46:52.700 }' 00:46:52.959 [2024-06-10 11:57:36.659504] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:46:52.959 [2024-06-10 11:57:36.659550] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid307211 ] 00:46:52.959 [2024-06-10 11:57:36.743981] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:52.959 [2024-06-10 11:57:36.824106] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:53.478  Copying: 64/64 [kB] (average 1333 kBps) 00:46:53.478 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # opcode= 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:46:53.478 
11:57:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:46:53.478 11:57:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:53.478 11:57:37 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:53.738 11:57:37 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:46:53.738 11:57:37 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.G1vP9tL19u /tmp/tmp.qhdnO2H9U1 00:46:53.738 11:57:37 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:46:53.738 11:57:37 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:46:53.738 11:57:37 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.G1vP9tL19u /tmp/tmp.qhdnO2H9U1 00:46:53.738 11:57:37 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@117 -- # sync 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@120 -- # set +e 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:46:53.738 rmmod nvme_tcp 00:46:53.738 rmmod nvme_fabrics 00:46:53.738 rmmod nvme_keyring 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@124 -- # set -e 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@125 -- # return 0 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@489 -- # '[' -n 306463 ']' 00:46:53.738 11:57:37 chaining -- nvmf/common.sh@490 -- # killprocess 306463 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@949 -- # '[' -z 306463 ']' 00:46:53.738 11:57:37 chaining -- 
common/autotest_common.sh@953 -- # kill -0 306463 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@954 -- # uname 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 306463 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 306463' 00:46:53.738 killing process with pid 306463 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@968 -- # kill 306463 00:46:53.738 11:57:37 chaining -- common/autotest_common.sh@973 -- # wait 306463 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:46:53.998 11:57:37 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:46:53.998 11:57:37 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:46:53.998 11:57:37 chaining -- bdev/chaining.sh@132 -- # bperfpid=307427 00:46:53.998 11:57:37 chaining -- bdev/chaining.sh@134 -- # waitforlisten 307427 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@830 -- # '[' -z 307427 ']' 00:46:53.998 11:57:37 chaining -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:46:53.998 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:46:53.998 11:57:37 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:46:53.998 11:57:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:53.998 [2024-06-10 11:57:37.853207] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:46:53.998 [2024-06-10 11:57:37.853262] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid307427 ] 00:46:54.257 [2024-06-10 11:57:37.955069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:46:54.257 [2024-06-10 11:57:38.036614] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:46:54.827 11:57:38 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:46:54.827 11:57:38 chaining -- common/autotest_common.sh@863 -- # return 0 00:46:54.827 11:57:38 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:46:54.827 11:57:38 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:46:54.827 11:57:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:46:54.827 malloc0 00:46:55.220 true 00:46:55.220 true 00:46:55.220 [2024-06-10 11:57:38.786682] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:46:55.220 crypto0 00:46:55.220 [2024-06-10 11:57:38.794705] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:46:55.220 crypto1 00:46:55.220 11:57:38 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:46:55.220 11:57:38 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:46:55.220 Running I/O for 5 seconds... 00:47:00.493 00:47:00.493 Latency(us) 00:47:00.493 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:00.493 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:47:00.493 Verification LBA range: start 0x0 length 0x2000 00:47:00.493 crypto1 : 5.01 18150.46 70.90 0.00 0.00 14073.06 4217.10 10713.71 00:47:00.493 =================================================================================================================== 00:47:00.493 Total : 18150.46 70.90 0.00 0.00 14073.06 4217.10 10713.71 00:47:00.493 0 00:47:00.493 11:57:43 chaining -- bdev/chaining.sh@146 -- # killprocess 307427 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@949 -- # '[' -z 307427 ']' 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@953 -- # kill -0 307427 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@954 -- # uname 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 307427 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 307427' 00:47:00.493 killing process with pid 307427 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@968 -- # kill 307427 00:47:00.493 Received shutdown signal, test time was about 5.000000 seconds 00:47:00.493 00:47:00.493 Latency(us) 
00:47:00.493 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:00.493 =================================================================================================================== 00:47:00.493 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:47:00.493 11:57:43 chaining -- common/autotest_common.sh@973 -- # wait 307427 00:47:00.493 11:57:44 chaining -- bdev/chaining.sh@152 -- # bperfpid=308239 00:47:00.493 11:57:44 chaining -- bdev/chaining.sh@154 -- # waitforlisten 308239 00:47:00.493 11:57:44 chaining -- common/autotest_common.sh@830 -- # '[' -z 308239 ']' 00:47:00.493 11:57:44 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:47:00.493 11:57:44 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:47:00.493 11:57:44 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:47:00.493 11:57:44 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:47:00.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:47:00.493 11:57:44 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:47:00.493 11:57:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:00.493 [2024-06-10 11:57:44.191896] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:47:00.493 [2024-06-10 11:57:44.191951] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid308239 ] 00:47:00.493 [2024-06-10 11:57:44.279123] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:47:00.493 [2024-06-10 11:57:44.366952] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:47:01.060 11:57:44 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:47:01.060 11:57:44 chaining -- common/autotest_common.sh@863 -- # return 0 00:47:01.060 11:57:44 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:47:01.060 11:57:44 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:47:01.060 11:57:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:01.319 malloc0 00:47:01.319 true 00:47:01.319 true 00:47:01.319 [2024-06-10 11:57:45.126962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:47:01.319 [2024-06-10 11:57:45.127002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:47:01.319 [2024-06-10 11:57:45.127017] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1025250 00:47:01.319 [2024-06-10 11:57:45.127026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:47:01.319 [2024-06-10 11:57:45.127771] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:47:01.319 [2024-06-10 11:57:45.127792] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:47:01.319 pt0 00:47:01.319 [2024-06-10 11:57:45.134997] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:47:01.319 crypto0 00:47:01.319 [2024-06-10 11:57:45.143016] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:47:01.319 crypto1 00:47:01.319 11:57:45 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:47:01.319 11:57:45 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:47:01.319 Running I/O for 5 seconds... 00:47:06.589 00:47:06.589 Latency(us) 00:47:06.589 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:06.589 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:47:06.589 Verification LBA range: start 0x0 length 0x2000 00:47:06.589 crypto1 : 5.01 14378.22 56.16 0.00 0.00 17760.42 2393.49 12366.36 00:47:06.589 =================================================================================================================== 00:47:06.589 Total : 14378.22 56.16 0.00 0.00 17760.42 2393.49 12366.36 00:47:06.589 0 00:47:06.589 11:57:50 chaining -- bdev/chaining.sh@167 -- # killprocess 308239 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@949 -- # '[' -z 308239 ']' 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@953 -- # kill -0 308239 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@954 -- # uname 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 308239 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 308239' 00:47:06.589 killing process with pid 308239 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@968 -- # kill 308239 00:47:06.589 Received shutdown signal, test time was about 5.000000 seconds 00:47:06.589 00:47:06.589 Latency(us) 00:47:06.589 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:06.589 
=================================================================================================================== 00:47:06.589 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@973 -- # wait 308239 00:47:06.589 11:57:50 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:47:06.589 11:57:50 chaining -- bdev/chaining.sh@170 -- # killprocess 308239 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@949 -- # '[' -z 308239 ']' 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@953 -- # kill -0 308239 00:47:06.589 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 953: kill: (308239) - No such process 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@976 -- # echo 'Process with pid 308239 is not found' 00:47:06.589 Process with pid 308239 is not found 00:47:06.589 11:57:50 chaining -- bdev/chaining.sh@171 -- # wait 308239 00:47:06.589 11:57:50 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:47:06.589 11:57:50 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:47:06.589 11:57:50 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@296 -- # e810=() 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:47:06.589 11:57:50 chaining -- nvmf/common.sh@297 -- # x722=() 00:47:06.590 11:57:50 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:47:06.590 11:57:50 chaining -- nvmf/common.sh@298 -- # mlx=() 00:47:06.590 11:57:50 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:47:06.848 11:57:50 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@336 -- # return 1 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:47:06.848 WARNING: No supported devices were found, fallback requested for tcp test 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:47:06.848 11:57:50 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:47:06.848 Cannot find device "nvmf_tgt_br" 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@155 -- # true 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:47:06.848 Cannot find device "nvmf_tgt_br2" 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@156 -- # true 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:47:06.848 Cannot find device "nvmf_tgt_br" 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@158 -- # true 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:47:06.848 Cannot find device "nvmf_tgt_br2" 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@159 -- # true 00:47:06.848 11:57:50 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:47:06.849 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@162 -- # true 00:47:06.849 11:57:50 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:47:06.849 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@163 -- # true 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:47:06.849 11:57:50 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:47:07.107 11:57:50 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:47:07.107 11:57:51 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:47:07.107 11:57:51 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:47:07.107 11:57:51 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:47:07.107 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:47:07.107 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.109 ms 00:47:07.107 00:47:07.107 --- 10.0.0.2 ping statistics --- 00:47:07.107 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:47:07.107 rtt min/avg/max/mdev = 0.109/0.109/0.109/0.000 ms 00:47:07.107 11:57:51 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:47:07.107 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:47:07.107 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.076 ms 00:47:07.107 00:47:07.107 --- 10.0.0.3 ping statistics --- 00:47:07.107 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:47:07.107 rtt min/avg/max/mdev = 0.076/0.076/0.076/0.000 ms 00:47:07.107 11:57:51 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:47:07.365 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:47:07.365 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.033 ms 00:47:07.365 00:47:07.365 --- 10.0.0.1 ping statistics --- 00:47:07.366 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:47:07.366 rtt min/avg/max/mdev = 0.033/0.033/0.033/0.000 ms 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@433 -- # return 0 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:47:07.366 11:57:51 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@481 -- # nvmfpid=309390 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:47:07.366 11:57:51 chaining -- nvmf/common.sh@482 -- # waitforlisten 309390 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@830 -- # '[' -z 309390 ']' 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:47:07.366 11:57:51 
chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:47:07.366 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:47:07.366 11:57:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:07.366 [2024-06-10 11:57:51.168049] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:47:07.366 [2024-06-10 11:57:51.168101] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:47:07.366 [2024-06-10 11:57:51.260741] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:47:07.624 [2024-06-10 11:57:51.345123] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:47:07.624 [2024-06-10 11:57:51.345161] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:47:07.624 [2024-06-10 11:57:51.345171] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:47:07.624 [2024-06-10 11:57:51.345179] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:47:07.624 [2024-06-10 11:57:51.345187] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:47:07.624 [2024-06-10 11:57:51.345214] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:47:08.191 11:57:51 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:47:08.191 11:57:51 chaining -- common/autotest_common.sh@863 -- # return 0 00:47:08.192 11:57:51 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:47:08.192 11:57:51 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:47:08.192 11:57:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:08.192 11:57:52 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:47:08.192 11:57:52 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:08.192 malloc0 00:47:08.192 [2024-06-10 11:57:52.027635] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:47:08.192 [2024-06-10 11:57:52.043832] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:47:08.192 11:57:52 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:47:08.192 11:57:52 chaining -- bdev/chaining.sh@189 -- # bperfpid=309476 00:47:08.192 11:57:52 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:47:08.192 11:57:52 chaining -- bdev/chaining.sh@191 -- # waitforlisten 309476 /var/tmp/bperf.sock 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@830 -- # '[' -z 309476 ']' 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@835 -- # local 
max_retries=100 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:47:08.192 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:47:08.192 11:57:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:08.192 [2024-06-10 11:57:52.109947] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 00:47:08.192 [2024-06-10 11:57:52.109999] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid309476 ] 00:47:08.450 [2024-06-10 11:57:52.197438] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:47:08.450 [2024-06-10 11:57:52.285850] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:47:09.018 11:57:52 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:47:09.018 11:57:52 chaining -- common/autotest_common.sh@863 -- # return 0 00:47:09.018 11:57:52 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:47:09.018 11:57:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:47:09.586 [2024-06-10 11:57:53.245265] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:47:09.586 nvme0n1 00:47:09.586 true 00:47:09.586 crypto0 00:47:09.586 11:57:53 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:47:09.586 Running I/O for 5 seconds... 
00:47:14.857 00:47:14.857 Latency(us) 00:47:14.857 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:14.857 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:47:14.857 Verification LBA range: start 0x0 length 0x2000 00:47:14.857 crypto0 : 5.01 12463.14 48.68 0.00 0.00 20489.30 2721.17 20287.67 00:47:14.857 =================================================================================================================== 00:47:14.857 Total : 12463.14 48.68 0.00 0.00 20489.30 2721.17 20287.67 00:47:14.857 0 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # opcode= 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@205 -- # sequence=124998 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:47:14.857 11:57:58 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@206 -- # encrypt=62499 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:47:14.857 11:57:58 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@207 -- # decrypt=62499 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:47:15.116 11:57:58 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:47:15.116 11:57:58 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:15.375 11:57:59 chaining -- bdev/chaining.sh@208 -- # crc32c=124998 00:47:15.375 11:57:59 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:47:15.375 11:57:59 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:47:15.375 11:57:59 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:47:15.375 11:57:59 chaining -- bdev/chaining.sh@214 -- # killprocess 309476 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@949 -- # '[' -z 309476 ']' 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@953 -- # kill -0 309476 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@954 -- # uname 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 309476 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 309476' 00:47:15.375 killing process with pid 309476 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@968 -- # kill 309476 00:47:15.375 Received shutdown signal, test time was about 5.000000 seconds 00:47:15.375 00:47:15.375 Latency(us) 00:47:15.375 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:15.375 
=================================================================================================================== 00:47:15.375 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:47:15.375 11:57:59 chaining -- common/autotest_common.sh@973 -- # wait 309476 00:47:15.635 11:57:59 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:47:15.635 11:57:59 chaining -- bdev/chaining.sh@219 -- # bperfpid=310472 00:47:15.635 11:57:59 chaining -- bdev/chaining.sh@221 -- # waitforlisten 310472 /var/tmp/bperf.sock 00:47:15.635 11:57:59 chaining -- common/autotest_common.sh@830 -- # '[' -z 310472 ']' 00:47:15.635 11:57:59 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:47:15.635 11:57:59 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:47:15.635 11:57:59 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:47:15.635 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:47:15.635 11:57:59 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:47:15.635 11:57:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:15.635 [2024-06-10 11:57:59.408855] Starting SPDK v24.09-pre git sha1 a3f6419f1 / DPDK 24.03.0 initialization... 
00:47:15.635 [2024-06-10 11:57:59.408916] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid310472 ] 00:47:15.635 [2024-06-10 11:57:59.495262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:47:15.635 [2024-06-10 11:57:59.579597] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:47:16.573 11:58:00 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:47:16.573 11:58:00 chaining -- common/autotest_common.sh@863 -- # return 0 00:47:16.573 11:58:00 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:47:16.573 11:58:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:47:16.832 [2024-06-10 11:58:00.553969] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:47:16.832 nvme0n1 00:47:16.832 true 00:47:16.832 crypto0 00:47:16.832 11:58:00 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:47:16.832 Running I/O for 5 seconds... 
00:47:22.118 00:47:22.118 Latency(us) 00:47:22.118 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:22.118 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:47:22.118 Verification LBA range: start 0x0 length 0x200 00:47:22.118 crypto0 : 5.00 2502.60 156.41 0.00 0.00 12543.63 598.37 13278.16 00:47:22.118 =================================================================================================================== 00:47:22.118 Total : 2502.60 156.41 0.00 0.00 12543.63 598.37 13278.16 00:47:22.118 0 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@39 -- # opcode= 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@233 -- # sequence=25046 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:47:22.118 11:58:05 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:22.118 11:58:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:47:22.118 11:58:06 chaining -- bdev/chaining.sh@234 -- # encrypt=12523 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@235 -- # decrypt=12523 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:47:22.377 11:58:06 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:47:22.377 11:58:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:47:22.635 11:58:06 chaining -- bdev/chaining.sh@236 -- # crc32c=25046 00:47:22.635 11:58:06 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:47:22.635 11:58:06 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:47:22.635 11:58:06 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:47:22.635 11:58:06 chaining -- bdev/chaining.sh@242 -- # killprocess 310472 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@949 -- # '[' -z 310472 ']' 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@953 -- # kill -0 310472 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@954 -- # uname 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 310472 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 310472' 00:47:22.635 killing process with pid 310472 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@968 -- # kill 310472 00:47:22.635 Received shutdown signal, test time was about 5.000000 seconds 00:47:22.635 00:47:22.635 Latency(us) 00:47:22.635 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:47:22.635 
=================================================================================================================== 00:47:22.635 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:47:22.635 11:58:06 chaining -- common/autotest_common.sh@973 -- # wait 310472 00:47:22.894 11:58:06 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@117 -- # sync 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@120 -- # set +e 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:47:22.894 rmmod nvme_tcp 00:47:22.894 rmmod nvme_fabrics 00:47:22.894 rmmod nvme_keyring 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@124 -- # set -e 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@125 -- # return 0 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@489 -- # '[' -n 309390 ']' 00:47:22.894 11:58:06 chaining -- nvmf/common.sh@490 -- # killprocess 309390 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@949 -- # '[' -z 309390 ']' 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@953 -- # kill -0 309390 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@954 -- # uname 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 309390 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 309390' 00:47:22.894 killing process with pid 309390 
00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@968 -- # kill 309390 00:47:22.894 11:58:06 chaining -- common/autotest_common.sh@973 -- # wait 309390 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:47:23.153 11:58:07 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:47:23.153 11:58:07 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:47:23.153 11:58:07 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:47:23.153 11:58:07 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:47:23.153 00:47:23.153 real 0m43.534s 00:47:23.153 user 0m54.535s 00:47:23.153 sys 0m12.579s 00:47:23.153 11:58:07 chaining -- common/autotest_common.sh@1125 -- # xtrace_disable 00:47:23.153 11:58:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:47:23.153 ************************************ 00:47:23.153 END TEST chaining 00:47:23.153 ************************************ 00:47:23.412 11:58:07 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:47:23.412 11:58:07 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:47:23.412 11:58:07 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:47:23.412 11:58:07 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:47:23.412 11:58:07 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:47:23.412 11:58:07 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:47:23.412 11:58:07 -- common/autotest_common.sh@723 -- # xtrace_disable 00:47:23.412 11:58:07 -- common/autotest_common.sh@10 -- # set +x 00:47:23.412 11:58:07 -- 
spdk/autotest.sh@383 -- # autotest_cleanup 00:47:23.412 11:58:07 -- common/autotest_common.sh@1391 -- # local autotest_es=0 00:47:23.412 11:58:07 -- common/autotest_common.sh@1392 -- # xtrace_disable 00:47:23.412 11:58:07 -- common/autotest_common.sh@10 -- # set +x 00:47:27.601 INFO: APP EXITING 00:47:27.601 INFO: killing all VMs 00:47:27.601 INFO: killing vhost app 00:47:27.601 INFO: EXIT DONE 00:47:30.889 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:47:30.889 Waiting for block devices as requested 00:47:30.889 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:47:30.889 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:47:30.889 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:47:30.889 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:47:31.148 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:47:31.148 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:47:31.148 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:47:31.408 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:47:31.408 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:47:31.408 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:47:31.408 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:47:31.667 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:47:31.667 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:47:31.667 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:47:31.926 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:47:31.926 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:47:31.926 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:47:36.126 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:47:36.126 Cleaning 00:47:36.126 Removing: /var/run/dpdk/spdk0/config 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:47:36.126 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:47:36.126 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:47:36.126 Removing: /var/run/dpdk/spdk0/hugepage_info 00:47:36.126 Removing: /dev/shm/nvmf_trace.0 00:47:36.126 Removing: /dev/shm/spdk_tgt_trace.pid94389 00:47:36.126 Removing: /var/run/dpdk/spdk0 00:47:36.126 Removing: /var/run/dpdk/spdk_pid100098 00:47:36.126 Removing: /var/run/dpdk/spdk_pid100480 00:47:36.126 Removing: /var/run/dpdk/spdk_pid100742 00:47:36.126 Removing: /var/run/dpdk/spdk_pid100944 00:47:36.126 Removing: /var/run/dpdk/spdk_pid101141 00:47:36.126 Removing: /var/run/dpdk/spdk_pid101359 00:47:36.126 Removing: /var/run/dpdk/spdk_pid102032 00:47:36.126 Removing: /var/run/dpdk/spdk_pid104466 00:47:36.126 Removing: /var/run/dpdk/spdk_pid104683 00:47:36.126 Removing: /var/run/dpdk/spdk_pid104917 00:47:36.126 Removing: /var/run/dpdk/spdk_pid105139 00:47:36.126 Removing: /var/run/dpdk/spdk_pid105182 00:47:36.126 Removing: /var/run/dpdk/spdk_pid105390 00:47:36.126 Removing: /var/run/dpdk/spdk_pid105587 00:47:36.126 Removing: /var/run/dpdk/spdk_pid105784 00:47:36.126 Removing: /var/run/dpdk/spdk_pid105977 00:47:36.126 Removing: /var/run/dpdk/spdk_pid106177 00:47:36.126 Removing: /var/run/dpdk/spdk_pid106423 00:47:36.126 Removing: /var/run/dpdk/spdk_pid106692 00:47:36.126 Removing: /var/run/dpdk/spdk_pid106925 00:47:36.126 Removing: /var/run/dpdk/spdk_pid107116 00:47:36.126 Removing: /var/run/dpdk/spdk_pid107325 00:47:36.126 Removing: /var/run/dpdk/spdk_pid107517 00:47:36.126 Removing: /var/run/dpdk/spdk_pid107760 00:47:36.126 Removing: /var/run/dpdk/spdk_pid108036 00:47:36.126 Removing: /var/run/dpdk/spdk_pid108239 00:47:36.126 Removing: /var/run/dpdk/spdk_pid108705 00:47:36.126 Removing: /var/run/dpdk/spdk_pid109093 00:47:36.126 Removing: 
/var/run/dpdk/spdk_pid109363 00:47:36.126 Removing: /var/run/dpdk/spdk_pid109577 00:47:36.126 Removing: /var/run/dpdk/spdk_pid109775 00:47:36.126 Removing: /var/run/dpdk/spdk_pid109977 00:47:36.126 Removing: /var/run/dpdk/spdk_pid110166 00:47:36.127 Removing: /var/run/dpdk/spdk_pid110366 00:47:36.127 Removing: /var/run/dpdk/spdk_pid110727 00:47:36.127 Removing: /var/run/dpdk/spdk_pid110928 00:47:36.127 Removing: /var/run/dpdk/spdk_pid111288 00:47:36.127 Removing: /var/run/dpdk/spdk_pid111496 00:47:36.127 Removing: /var/run/dpdk/spdk_pid111861 00:47:36.127 Removing: /var/run/dpdk/spdk_pid112065 00:47:36.127 Removing: /var/run/dpdk/spdk_pid112378 00:47:36.127 Removing: /var/run/dpdk/spdk_pid112498 00:47:36.127 Removing: /var/run/dpdk/spdk_pid112785 00:47:36.127 Removing: /var/run/dpdk/spdk_pid113222 00:47:36.127 Removing: /var/run/dpdk/spdk_pid113555 00:47:36.127 Removing: /var/run/dpdk/spdk_pid113625 00:47:36.127 Removing: /var/run/dpdk/spdk_pid116891 00:47:36.127 Removing: /var/run/dpdk/spdk_pid118599 00:47:36.127 Removing: /var/run/dpdk/spdk_pid120291 00:47:36.127 Removing: /var/run/dpdk/spdk_pid121035 00:47:36.127 Removing: /var/run/dpdk/spdk_pid122087 00:47:36.127 Removing: /var/run/dpdk/spdk_pid122295 00:47:36.127 Removing: /var/run/dpdk/spdk_pid122483 00:47:36.127 Removing: /var/run/dpdk/spdk_pid122506 00:47:36.127 Removing: /var/run/dpdk/spdk_pid126298 00:47:36.127 Removing: /var/run/dpdk/spdk_pid126693 00:47:36.127 Removing: /var/run/dpdk/spdk_pid127744 00:47:36.127 Removing: /var/run/dpdk/spdk_pid127948 00:47:36.127 Removing: /var/run/dpdk/spdk_pid132412 00:47:36.127 Removing: /var/run/dpdk/spdk_pid133697 00:47:36.127 Removing: /var/run/dpdk/spdk_pid134499 00:47:36.127 Removing: /var/run/dpdk/spdk_pid138418 00:47:36.127 Removing: /var/run/dpdk/spdk_pid139704 00:47:36.127 Removing: /var/run/dpdk/spdk_pid140506 00:47:36.127 Removing: /var/run/dpdk/spdk_pid143885 00:47:36.127 Removing: /var/run/dpdk/spdk_pid145803 00:47:36.127 Removing: 
/var/run/dpdk/spdk_pid146558 00:47:36.127 Removing: /var/run/dpdk/spdk_pid154243 00:47:36.127 Removing: /var/run/dpdk/spdk_pid155978 00:47:36.127 Removing: /var/run/dpdk/spdk_pid156783 00:47:36.127 Removing: /var/run/dpdk/spdk_pid165100 00:47:36.127 Removing: /var/run/dpdk/spdk_pid166845 00:47:36.127 Removing: /var/run/dpdk/spdk_pid167653 00:47:36.127 Removing: /var/run/dpdk/spdk_pid175461 00:47:36.127 Removing: /var/run/dpdk/spdk_pid178074 00:47:36.127 Removing: /var/run/dpdk/spdk_pid178921 00:47:36.127 Removing: /var/run/dpdk/spdk_pid187585 00:47:36.127 Removing: /var/run/dpdk/spdk_pid189601 00:47:36.127 Removing: /var/run/dpdk/spdk_pid190479 00:47:36.127 Removing: /var/run/dpdk/spdk_pid199690 00:47:36.127 Removing: /var/run/dpdk/spdk_pid201765 00:47:36.127 Removing: /var/run/dpdk/spdk_pid202591 00:47:36.127 Removing: /var/run/dpdk/spdk_pid211433 00:47:36.127 Removing: /var/run/dpdk/spdk_pid214441 00:47:36.127 Removing: /var/run/dpdk/spdk_pid215415 00:47:36.127 Removing: /var/run/dpdk/spdk_pid216254 00:47:36.127 Removing: /var/run/dpdk/spdk_pid218918 00:47:36.127 Removing: /var/run/dpdk/spdk_pid223607 00:47:36.127 Removing: /var/run/dpdk/spdk_pid225786 00:47:36.127 Removing: /var/run/dpdk/spdk_pid229764 00:47:36.127 Removing: /var/run/dpdk/spdk_pid232668 00:47:36.127 Removing: /var/run/dpdk/spdk_pid237094 00:47:36.127 Removing: /var/run/dpdk/spdk_pid239553 00:47:36.127 Removing: /var/run/dpdk/spdk_pid244837 00:47:36.127 Removing: /var/run/dpdk/spdk_pid246594 00:47:36.127 Removing: /var/run/dpdk/spdk_pid252281 00:47:36.127 Removing: /var/run/dpdk/spdk_pid254188 00:47:36.127 Removing: /var/run/dpdk/spdk_pid259303 00:47:36.127 Removing: /var/run/dpdk/spdk_pid261127 00:47:36.127 Removing: /var/run/dpdk/spdk_pid264735 00:47:36.127 Removing: /var/run/dpdk/spdk_pid265090 00:47:36.127 Removing: /var/run/dpdk/spdk_pid265439 00:47:36.127 Removing: /var/run/dpdk/spdk_pid265800 00:47:36.127 Removing: /var/run/dpdk/spdk_pid266236 00:47:36.127 Removing: 
/var/run/dpdk/spdk_pid266845 00:47:36.127 Removing: /var/run/dpdk/spdk_pid267647 00:47:36.127 Removing: /var/run/dpdk/spdk_pid267955 00:47:36.127 Removing: /var/run/dpdk/spdk_pid269036 00:47:36.127 Removing: /var/run/dpdk/spdk_pid270113 00:47:36.127 Removing: /var/run/dpdk/spdk_pid271201 00:47:36.127 Removing: /var/run/dpdk/spdk_pid271984 00:47:36.127 Removing: /var/run/dpdk/spdk_pid272964 00:47:36.127 Removing: /var/run/dpdk/spdk_pid274128 00:47:36.127 Removing: /var/run/dpdk/spdk_pid275610 00:47:36.127 Removing: /var/run/dpdk/spdk_pid276362 00:47:36.127 Removing: /var/run/dpdk/spdk_pid276924 00:47:36.127 Removing: /var/run/dpdk/spdk_pid277441 00:47:36.127 Removing: /var/run/dpdk/spdk_pid279185 00:47:36.127 Removing: /var/run/dpdk/spdk_pid281040 00:47:36.127 Removing: /var/run/dpdk/spdk_pid282855 00:47:36.127 Removing: /var/run/dpdk/spdk_pid283894 00:47:36.127 Removing: /var/run/dpdk/spdk_pid285113 00:47:36.127 Removing: /var/run/dpdk/spdk_pid285645 00:47:36.127 Removing: /var/run/dpdk/spdk_pid285683 00:47:36.127 Removing: /var/run/dpdk/spdk_pid285760 00:47:36.127 Removing: /var/run/dpdk/spdk_pid286087 00:47:36.127 Removing: /var/run/dpdk/spdk_pid286144 00:47:36.127 Removing: /var/run/dpdk/spdk_pid287207 00:47:36.127 Removing: /var/run/dpdk/spdk_pid288723 00:47:36.127 Removing: /var/run/dpdk/spdk_pid290229 00:47:36.127 Removing: /var/run/dpdk/spdk_pid290944 00:47:36.127 Removing: /var/run/dpdk/spdk_pid291751 00:47:36.127 Removing: /var/run/dpdk/spdk_pid292023 00:47:36.127 Removing: /var/run/dpdk/spdk_pid292047 00:47:36.127 Removing: /var/run/dpdk/spdk_pid292069 00:47:36.127 Removing: /var/run/dpdk/spdk_pid293012 00:47:36.127 Removing: /var/run/dpdk/spdk_pid293556 00:47:36.127 Removing: /var/run/dpdk/spdk_pid293931 00:47:36.127 Removing: /var/run/dpdk/spdk_pid295755 00:47:36.127 Removing: /var/run/dpdk/spdk_pid297612 00:47:36.127 Removing: /var/run/dpdk/spdk_pid299580 00:47:36.127 Removing: /var/run/dpdk/spdk_pid301033 00:47:36.127 Removing: 
/var/run/dpdk/spdk_pid302108
00:47:36.127 Removing: /var/run/dpdk/spdk_pid302648
00:47:36.127 Removing: /var/run/dpdk/spdk_pid302669
00:47:36.127 Removing: /var/run/dpdk/spdk_pid306615
00:47:36.127 Removing: /var/run/dpdk/spdk_pid306777
00:47:36.127 Removing: /var/run/dpdk/spdk_pid306971
00:47:36.127 Removing: /var/run/dpdk/spdk_pid307004
00:47:36.127 Removing: /var/run/dpdk/spdk_pid307211
00:47:36.127 Removing: /var/run/dpdk/spdk_pid307427
00:47:36.127 Removing: /var/run/dpdk/spdk_pid308239
00:47:36.127 Removing: /var/run/dpdk/spdk_pid309476
00:47:36.127 Removing: /var/run/dpdk/spdk_pid310472
00:47:36.127 Removing: /var/run/dpdk/spdk_pid93720
00:47:36.127 Removing: /var/run/dpdk/spdk_pid94389
00:47:36.127 Removing: /var/run/dpdk/spdk_pid94919
00:47:36.127 Removing: /var/run/dpdk/spdk_pid95651
00:47:36.127 Removing: /var/run/dpdk/spdk_pid95839
00:47:36.127 Removing: /var/run/dpdk/spdk_pid96594
00:47:36.127 Removing: /var/run/dpdk/spdk_pid96756
00:47:36.127 Removing: /var/run/dpdk/spdk_pid96966
00:47:36.127 Removing: /var/run/dpdk/spdk_pid98576
00:47:36.127 Removing: /var/run/dpdk/spdk_pid99633
00:47:36.127 Removing: /var/run/dpdk/spdk_pid99864
00:47:36.127 Clean
00:47:36.127 11:58:20 -- common/autotest_common.sh@1450 -- # return 0
00:47:36.127 11:58:20 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:47:36.127 11:58:20 -- common/autotest_common.sh@729 -- # xtrace_disable
00:47:36.127 11:58:20 -- common/autotest_common.sh@10 -- # set +x
00:47:36.127 11:58:20 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:47:36.127 11:58:20 -- common/autotest_common.sh@729 -- # xtrace_disable
00:47:36.127 11:58:20 -- common/autotest_common.sh@10 -- # set +x
00:47:36.386 11:58:20 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:47:36.386 11:58:20 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:47:36.386 11:58:20 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:47:36.386 11:58:20 -- spdk/autotest.sh@391 -- # hash lcov
00:47:36.386 11:58:20 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:47:36.386 11:58:20 -- spdk/autotest.sh@393 -- # hostname
00:47:36.386 11:58:20 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-52 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:47:36.386 geninfo: WARNING: invalid characters removed from testname!
00:47:58.398 11:58:39 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:47:58.398 11:58:42 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:48:00.301 11:58:43 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:48:01.676 11:58:45 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:48:03.579 11:58:47 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:48:04.958 11:58:48 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:48:06.862 11:58:50 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:48:06.862 11:58:50 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:48:06.862 11:58:50 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:48:06.862 11:58:50 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:48:06.862 11:58:50 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:48:06.862 11:58:50 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:48:06.862 11:58:50 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:48:06.862 11:58:50 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:48:06.862 11:58:50 -- paths/export.sh@5 -- $ export PATH
00:48:06.862 11:58:50 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:48:06.862 11:58:50 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:48:06.862 11:58:50 -- common/autobuild_common.sh@437 -- $ date +%s
00:48:06.862 11:58:50 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718013530.XXXXXX
00:48:06.862 11:58:50 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718013530.WDRgs8
00:48:06.862 11:58:50 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]]
00:48:06.862 11:58:50 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']'
00:48:06.862 11:58:50 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:48:06.862 11:58:50 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:48:06.862 11:58:50 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:48:06.862 11:58:50 -- common/autobuild_common.sh@453 -- $ get_config_params
00:48:06.862 11:58:50 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:48:06.862 11:58:50 -- common/autotest_common.sh@10 -- $ set +x
00:48:06.862 11:58:50 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:48:06.862 11:58:50 -- common/autobuild_common.sh@455 -- $ start_monitor_resources
00:48:06.862 11:58:50 -- pm/common@17 -- $ local monitor
00:48:06.862 11:58:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:06.862 11:58:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:06.862 11:58:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:06.862 11:58:50 -- pm/common@21 -- $ date +%s
00:48:06.862 11:58:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:06.862 11:58:50 -- pm/common@21 -- $ date +%s
00:48:06.862 11:58:50 -- pm/common@25 -- $ sleep 1
00:48:06.862 11:58:50 -- pm/common@21 -- $ date +%s
00:48:06.862 11:58:50 -- pm/common@21 -- $ date +%s
00:48:06.862 11:58:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718013530
00:48:06.862 11:58:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718013530
00:48:06.862 11:58:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718013530
00:48:06.862 11:58:50 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718013530
00:48:06.862 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718013530_collect-vmstat.pm.log
00:48:06.862 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718013530_collect-cpu-temp.pm.log
00:48:06.862 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718013530_collect-cpu-load.pm.log
00:48:06.862 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718013530_collect-bmc-pm.bmc.pm.log
00:48:07.800 11:58:51 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT
00:48:07.800 11:58:51 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:48:07.800 11:58:51 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:48:07.800 11:58:51 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:48:07.800 11:58:51 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:48:07.800 11:58:51 -- spdk/autopackage.sh@19 -- $ timing_finish
00:48:07.800 11:58:51 -- common/autotest_common.sh@735 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:48:07.800 11:58:51 -- common/autotest_common.sh@736 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:48:07.800 11:58:51 -- common/autotest_common.sh@738 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:48:08.060 11:58:51 -- spdk/autopackage.sh@20 -- $ exit 0
00:48:08.060 11:58:51 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:48:08.060 11:58:51 -- pm/common@29 -- $ signal_monitor_resources TERM
00:48:08.060 11:58:51 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:48:08.060 11:58:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:08.060 11:58:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:48:08.060 11:58:51 -- pm/common@44 -- $ pid=320319
00:48:08.060 11:58:51 -- pm/common@50 -- $ kill -TERM 320319
00:48:08.060 11:58:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:08.060 11:58:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:48:08.060 11:58:51 -- pm/common@44 -- $ pid=320321
00:48:08.060 11:58:51 -- pm/common@50 -- $ kill -TERM 320321
00:48:08.060 11:58:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:08.060 11:58:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:48:08.060 11:58:51 -- pm/common@44 -- $ pid=320323
00:48:08.060 11:58:51 -- pm/common@50 -- $ kill -TERM 320323
00:48:08.060 11:58:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:48:08.060 11:58:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:48:08.060 11:58:51 -- pm/common@44 -- $ pid=320347
00:48:08.060 11:58:51 -- pm/common@50 -- $ sudo -E kill -TERM 320347
00:48:08.060 + [[ -n 4177385 ]]
00:48:08.060 + sudo kill 4177385
00:48:08.070 [Pipeline] }
00:48:08.091 [Pipeline] // stage
00:48:08.099 [Pipeline] }
00:48:08.116 [Pipeline] // timeout
00:48:08.121 [Pipeline] }
00:48:08.139 [Pipeline] // catchError
00:48:08.145 [Pipeline] }
00:48:08.165 [Pipeline] // wrap
00:48:08.172 [Pipeline] }
00:48:08.188 [Pipeline] // catchError
00:48:08.198 [Pipeline] stage
00:48:08.200 [Pipeline] { (Epilogue)
00:48:08.216 [Pipeline] catchError
00:48:08.218 [Pipeline] {
00:48:08.233 [Pipeline] echo
00:48:08.235 Cleanup processes
00:48:08.241 [Pipeline] sh
00:48:08.530 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:48:08.530 320424 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:48:08.530 320642 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:48:08.544 [Pipeline] sh
00:48:08.826 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:48:08.826 ++ grep -v 'sudo pgrep'
00:48:08.826 ++ awk '{print $1}'
00:48:08.826 + sudo kill -9 320424
00:48:08.837 [Pipeline] sh
00:48:09.118 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:48:17.249 [Pipeline] sh
00:48:17.534 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:48:17.534 Artifacts sizes are good
00:48:17.549 [Pipeline] archiveArtifacts
00:48:17.556 Archiving artifacts
00:48:17.698 [Pipeline] sh
00:48:18.010 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:48:18.024 [Pipeline] cleanWs
00:48:18.034 [WS-CLEANUP] Deleting project workspace...
00:48:18.034 [WS-CLEANUP] Deferred wipeout is used...
00:48:18.041 [WS-CLEANUP] done
00:48:18.043 [Pipeline] }
00:48:18.064 [Pipeline] // catchError
00:48:18.076 [Pipeline] sh
00:48:18.357 + logger -p user.info -t JENKINS-CI
00:48:18.368 [Pipeline] }
00:48:18.397 [Pipeline] // stage
00:48:18.408 [Pipeline] }
00:48:18.423 [Pipeline] // node
00:48:18.426 [Pipeline] End of Pipeline
00:48:18.444 Finished: SUCCESS